19-Jun-2015 SLAC research resumes at upgraded Large Hadron Collider
Research with the Large Hadron Collider has officially resumed: On June 3, the world's largest particle accelerator, at CERN, began collecting data at a new record energy that could hold the key to new scientific discoveries. To keep up with the boost in performance, researchers at the Department of Energy's SLAC National Accelerator Laboratory have developed new technologies for ATLAS -- one of the two experiments involved in the 2012 discovery of the Higgs boson.
15-Jun-2015 What the blank makes quantum dots blink?
Quantum dots promise an astounding range of applications, if scientists can conquer their annoying habit of blinking. Researchers computing at NERSC recently ran simulations that offer new insights into the problem.
1-Jun-2015 Meraculous: Deciphering the 'book of life' with supercomputers
A team of scientists from Berkeley Lab, JGI and UC Berkeley has simplified and sped up genome assembly, reducing a months-long process to mere minutes. This was achieved primarily by 'parallelizing' the code to harness the processing power of supercomputers, such as NERSC's Edison system.
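The parallelization idea behind this speedup can be illustrated with a toy sketch. This is not the Meraculous code (which uses distributed hash tables across supercomputer nodes); it is a minimal, hypothetical example of splitting one core assembly step, k-mer counting, across worker processes and merging the partial results.

```python
# Illustrative sketch only -- not the actual Meraculous/HipMer code.
# Shows the general idea of parallelizing k-mer counting, a core
# genome-assembly step, across worker processes.
from collections import Counter
from multiprocessing import Pool

K = 4  # k-mer length for this toy example (real assemblers use larger k)

def count_kmers(read):
    """Count all overlapping k-mers in one sequencing read."""
    return Counter(read[i:i + K] for i in range(len(read) - K + 1))

def parallel_kmer_counts(reads, workers=2):
    """Count k-mers across reads in parallel, then merge partial counts."""
    with Pool(workers) as pool:
        partials = pool.map(count_kmers, reads)
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    reads = ["ACGTACGT", "CGTACGTA"]
    counts = parallel_kmer_counts(reads)
    print(counts["ACGT"])  # "ACGT" occurs twice in the first read, once in the second
```

In the real system the merge step is itself distributed, so no single node ever holds the full k-mer table.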
27-May-2015 The 'why' of models
An international team of researchers from Oak Ridge National Laboratory, Macquarie University, the University of Western Sydney and the Max Planck Institute for Biogeochemistry set out to assess how two Free-Air CO2 Enrichment projects compared to eleven vegetation models that simulate various ecological processes. Instead of only benchmarking whether an individual model matched the experimental data, the researchers developed an 'assumption-centered' approach to evaluate why certain models performed better than others.
13-May-2015 Digitizing neurons
Supercomputing resources at Oak Ridge National Laboratory will support a new initiative designed to advance how scientists digitally reconstruct and analyze individual neurons in the human brain.
11-May-2015 'Chombo-crunch' sinks its teeth into fluid dynamics
Researchers at Lawrence Berkeley National Laboratory are breaking new ground in the modeling of complex flows in energy and oil and gas applications, thanks to a computational fluid dynamics and transport code dubbed 'Chombo-Crunch.'
8-May-2015 New method relates Greenland ice sheet changes to sea-level rise
Climate models are not yet able to include full models of the Greenland and Antarctic ice sheets and to dynamically simulate how ice sheet changes influence sea level. Early schemes failed to accurately account for mass increase due to snowfall and mass loss due to snow melt. These changes depend on ice sheet elevation and region. A new method that includes the effects of elevation and region was developed.
1-May-2015 New mathematical method enhances hydrology simulations
Just as a racecar's engine needs the right fuel to get the best performance, so climate models need finely tuned parameters to accurately simulate the impacts of different technologies and policies. Led by researchers at Pacific Northwest National Laboratory, a team applied sophisticated mathematical methods to fine-tune the water and energy exchange parameters -- numerical stand-ins for complex processes -- to better simulate water and energy fluxes.
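The notion of tuning a stand-in parameter against observations can be sketched in miniature. This toy example is not the PNNL team's method (which involved far more sophisticated mathematics); all names, the one-parameter model, and the synthetic "measurements" here are invented for illustration: a grid search picks the parameter value that minimizes squared error against observed fluxes.

```python
# Toy calibration sketch -- not the actual PNNL approach. A single
# hypothetical parameter `alpha` is tuned so a trivial model best
# matches synthetic "observed" fluxes, by minimizing squared error.
def model_flux(forcing, alpha):
    """Hypothetical one-parameter stand-in for a land-surface flux model."""
    return alpha * forcing

def calibrate(forcings, observed, candidates):
    """Return the candidate alpha with the smallest sum of squared errors."""
    def sse(alpha):
        return sum((model_flux(f, alpha) - o) ** 2
                   for f, o in zip(forcings, observed))
    return min(candidates, key=sse)

forcings = [1.0, 2.0, 3.0]
observed = [0.9, 2.1, 2.9]  # synthetic "measurements"
best = calibrate(forcings, observed, [a / 100 for a in range(50, 151)])
```

Real calibrations search many interacting parameters at once, which is why efficient mathematical methods (rather than brute-force grids) matter.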
31-Mar-2015 BigNeuron: Unlocking the secrets of the human brain
To find a standard 3-D neuron reconstruction algorithm, BigNeuron will sponsor a series of international hackathons and workshops where contending algorithms will be ported onto a common software platform to analyze neuronal physical structure using the same core dataset. All ported algorithms will be bench-tested at DOE's NERSC and ORNL, as well as Human Brain Project supercomputing centers.
25-Mar-2015 Protein shake-up
Proteins are essential to living organisms, which need them to thrive. In recent years, a certain class of proteins has challenged researchers' conventional notion that proteins have a static and well-defined structure.
20-Mar-2015 Organic photovoltaics experiments showcase HPC 'superfacility' concept
A collaborative effort linking the Advanced Light Source at Lawrence Berkeley National Laboratory with supercomputing resources at NERSC and the Oak Ridge Leadership Computing Facility is yielding exciting results in organic photovoltaics research that could transform the way researchers use these facilities and improve scientific productivity in the process.
18-Mar-2015 Nanostructure complex materials modeling
Brookhaven physicist Simon Billinge illustrates how advances in computing and applied mathematics can improve the predictive value of models used to design new materials.
17-Mar-2015 Granular data processing on HPCs using an event service
Brookhaven Lab/ATLAS physicist Torre Wenaus describes an effort to trickle small 'grains' of data generated by the ATLAS experiment at the Large Hadron Collider (LHC) in Europe into small pockets of unused supercomputing time, sandwiched between big jobs on high-performance supercomputers.
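The backfilling idea described above can be sketched with a toy scheduler. This is a hypothetical illustration, not the ATLAS event service itself: small work "grains" are greedily packed into an idle gap left between big scheduled jobs, so otherwise-wasted cycles do useful work.

```python
# Hypothetical sketch of the backfill idea -- not ATLAS's actual
# event service. Small "grains" of work are slotted into idle time
# between big jobs on a supercomputer node.
def backfill(gap_seconds, grain_costs):
    """Greedily pack grains (each with a runtime cost, in seconds) into
    an idle gap. Returns the indices of the grains scheduled."""
    scheduled, used = [], 0
    for i, cost in enumerate(grain_costs):
        if used + cost <= gap_seconds:
            scheduled.append(i)
            used += cost
    return scheduled

# A 100-second gap between big jobs absorbs as many grains as fit.
grains = [30, 50, 40, 10]
picked = backfill(100, grains)  # grains 0, 1 and 3 fit (30 + 50 + 10 <= 100)
```

The fine granularity is the point: the smaller the grains, the less compute time is stranded in gaps too short for a full-sized job.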
5-Mar-2015 Between micro and macro, Berkeley Lab mathematicians model fluids at the mesoscale
The math whizzes at Berkeley Lab's Center for Computational Sciences and Engineering are at the forefront of a neglected corner of the scientific world, building mathematical models of fluids at the mesoscale. The little-known field of fluctuating hydrodynamics could have enormous impacts in applications ranging from batteries to drug delivery to microfluidic devices.
20-Jan-2015 Pinpointing the magnetic moments of nuclear matter
Using supercomputing resources at the National Energy Research Scientific Computing Center at Berkeley Lab, a team of nuclear physicists has demonstrated for the first time the ability of quantum chromodynamics (QCD) -- a fundamental theory in particle physics -- to calculate the magnetic structure of some of the lightest nuclei. Their findings are part of an ongoing effort to further our understanding of the universe.
16-Dec-2014 A standard for neuroscience data
In November, Neurodata without Borders hosted a hackathon to consolidate ideas for designing and implementing a standard neuroscience file format. And BrainFormat, a neuroscience data standardization framework developed at Berkeley Lab, was one of several candidates selected for further investigation. It is now a strong contender to contribute to the development of a community-wide data format and storage standard for the neuroscience research community.
1-Dec-2014 Optimized algorithms boost combustion research
Turbulent combustion simulations, used in the design of more fuel-efficient combustion systems, have gotten their own efficiency boost, thanks to researchers from Berkeley Lab's Computational Research Division. They developed new algorithmic features that streamline turbulent flame simulations, which are commonly used in the design of combustion systems such as diesel engines; after testing the enhanced code on NERSC supercomputers, they were able to achieve dramatic improvements in simulation times, which will help reduce the time -- and thus the cost -- of designing new engines.
24-Nov-2014 Berkeley Lab algorithms help researchers understand dark energy
To unlock the mystery of dark energy and its influence on the universe, researchers must rely on indirect observations -- watching how fast Type Ia supernovae recede from us as the universe expands. The process of identifying and tracking these objects requires scientists to scrupulously monitor the night sky for slight changes, a task that would be tedious and time-consuming for the Dark Energy Survey without novel tools developed by Berkeley Lab and UC Berkeley researchers.
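The monitoring task above amounts to spotting what changed between images of the same sky patch. As a hedged toy sketch (not the Berkeley Lab/UC Berkeley pipeline, which involves image alignment, noise modeling, and machine-learned candidate vetting), the core idea is difference imaging: subtract a reference image from a new one and flag pixels that brightened past a threshold.

```python
# Illustrative difference-imaging sketch -- not the actual Dark Energy
# Survey tools. A transient candidate (e.g. a supernova) shows up as a
# large positive residual after subtracting a reference image.
def difference_image(new, ref):
    """Pixel-wise difference of two equal-sized images (lists of rows)."""
    return [[n - r for n, r in zip(new_row, ref_row)]
            for new_row, ref_row in zip(new, ref)]

def find_candidates(diff, threshold):
    """Return (row, col) positions that brightened past the threshold."""
    return [(i, j)
            for i, row in enumerate(diff)
            for j, value in enumerate(row)
            if value > threshold]

ref = [[10, 10, 10],
       [10, 10, 10],
       [10, 10, 10]]
new = [[10, 11, 10],
       [10, 10, 55],  # a "new" bright source appears here
       [10, 10, 10]]
hits = find_candidates(difference_image(new, ref), threshold=20)  # [(1, 2)]
```

The threshold separates genuine brightening from small fluctuations (here, the pixel that rose from 10 to 11 is ignored); real pipelines replace it with statistical noise models.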
19-Nov-2014 Fast company
Researchers answering fundamental scientific questions in biology, climate and chemistry look to high performance computing and robust software. With its history of integrating experiment and computation, EMSL supports research into climate change, contaminated soil remediation, and energy production and storage with its Cascade supercomputer and enhanced NWChem computational chemistry software.
17-Nov-2014 Spiraling back in time
A team of researchers from the Netherlands and Japan is a Gordon Bell Prize finalist for simulating the evolution of the Milky Way galaxy with a code developed for GPU supercomputing architectures, including that of the Cray XK7 Titan at the Department of Energy's Oak Ridge National Laboratory.
22-Oct-2014 Brookhaven Lab launches Computational Science Initiative
Building on its capabilities in computational science and data management, the US Department of Energy's Brookhaven National Laboratory is embarking upon a major new Computational Science Initiative. This program will leverage computational science expertise and investments across multiple programs at the Laboratory -- including the flagship facilities that attract thousands of scientific users each year -- further establishing Brookhaven as a leader in tackling the 'big data' challenges at the frontiers of scientific discovery.
The Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.