1-May-2015 New mathematical method enhances hydrology simulations
Just as a racecar's engine needs the right fuel to get the best performance, climate models need finely tuned parameters to accurately simulate the impacts of different technologies and policies. A team led by researchers at Pacific Northwest National Laboratory applied sophisticated mathematical methods to fine-tune water and energy exchange parameters (numerical stand-ins for complex processes) to better simulate water and energy fluxes.
31-Mar-2015 BigNeuron: Unlocking the secrets of the human brain
To find a standard 3-D neuron reconstruction algorithm, BigNeuron will sponsor a series of international hackathons and workshops where contending algorithms will be ported onto a common software platform to analyze neuronal physical structure using the same core dataset. All ported algorithms will be bench-tested at DOE's NERSC and ORNL, as well as at Human Brain Project supercomputing centers.
25-Mar-2015 Protein shake-up
Proteins are an essential part of every living organism and are needed for it to thrive. In recent years, a certain class of proteins has challenged researchers' conventional notion that proteins have a static and well-defined structure.
20-Mar-2015 Organic photovoltaics experiments showcase HPC 'superfacility' concept
A collaborative effort linking the Advanced Light Source at Lawrence Berkeley National Laboratory with supercomputing resources at NERSC and the Oak Ridge Leadership Computing Facility is yielding exciting results in organic photovoltaics research that could transform the way researchers use these facilities and improve scientific productivity in the process.
18-Mar-2015 Nanostructure complex materials modeling
Brookhaven physicist Simon Billinge illustrates how advances in computing and applied mathematics can improve the predictive value of models used to design new materials.
17-Mar-2015 Granular data processing on HPCs using an event service
Brookhaven Lab/ATLAS physicist Torre Wenaus describes an effort to trickle small 'grains' of data generated by the ATLAS experiment at the Large Hadron Collider (LHC) in Europe into small pockets of unused supercomputing time, sandwiched between big jobs on high-performance supercomputers.
5-Mar-2015 Between micro and macro, Berkeley Lab mathematicians model fluids at the mesoscale
The math whizzes at Berkeley Lab's Center for Computational Sciences and Engineering are at the forefront of a neglected corner of the scientific world, building mathematical models for fluids at the mesoscale. The little-known field of fluctuating hydrodynamics could have enormous impacts in applications ranging from batteries to drug delivery to microfluidic devices.
20-Jan-2015 Pinpointing the magnetic moments of nuclear matter
Using supercomputing resources at the National Energy Research Scientific Computing Center at Berkeley Lab, a team of nuclear physicists has demonstrated for the first time the ability of quantum chromodynamics (QCD) -- a fundamental theory in particle physics -- to calculate the magnetic structure of some of the lightest nuclei. Their findings are part of an ongoing effort to further our understanding of the universe.
16-Dec-2014 A standard for neuroscience data
In November, Neurodata without Borders hosted a hackathon to consolidate ideas for designing and implementing a standard neuroscience file format. And BrainFormat, a neuroscience data standardization framework developed at Berkeley Lab, was one of several candidates selected for further investigation. It is now a strong contender to contribute to the development of a community-wide data format and storage standard for the neuroscience research community.
1-Dec-2014 Optimized algorithms boost combustion research
Turbulent combustion simulations, used in the design of more fuel-efficient combustion systems, have gotten their own efficiency boost, thanks to researchers from Berkeley Lab's Computational Research Division. They developed new algorithmic features that streamline turbulent flame simulations, which are commonly used in the design of combustion systems such as diesel engines; after testing the enhanced code on NERSC supercomputers, they were able to achieve dramatic improvements in simulation times, which will help reduce the time -- and thus the cost -- of designing new engines.
24-Nov-2014 Berkeley Lab algorithms help researchers understand dark energy
To unlock the mystery of dark energy and its influence on the universe, researchers must rely on indirect observations -- watching how fast Type Ia supernovae recede from us as the universe expands. The process of identifying and tracking these objects requires scientists to scrupulously monitor the night sky for slight changes, a task that would be tedious and time-consuming for the Dark Energy Survey without novel tools developed by Berkeley Lab and UC Berkeley researchers.
19-Nov-2014 Fast company
Researchers answering fundamental scientific questions in biology, climate and chemistry look to high performance computing and robust software. With its history of integrating experiment and computation, EMSL supports research into climate change, contaminated soil remediation, and energy production and storage with its Cascade supercomputer and enhanced NWChem computational chemistry software.
17-Nov-2014 Spiraling back in time
A team of researchers from the Netherlands and Japan is a Gordon Bell Prize finalist for simulating the evolution of the Milky Way galaxy with a code developed for GPU supercomputing architectures, including that of the Cray XK7 Titan at the Department of Energy's Oak Ridge National Laboratory.
22-Oct-2014 Brookhaven Lab launches Computational Science Initiative
Building on its capabilities in computational science and data management, the US Department of Energy's Brookhaven National Laboratory is embarking upon a major new Computational Science Initiative. This program will leverage computational science expertise and investments across multiple programs at the Laboratory, including the flagship facilities that attract thousands of scientific users each year, further establishing Brookhaven as a leader in tackling the 'big data' challenges at the frontiers of scientific discovery.
17-Oct-2014 Atomic trigger shatters mystery of how glass deforms
A new study at the Department of Energy's Oak Ridge National Laboratory, published Sept. 24 in Nature Communications, has cracked one mystery of glass, shedding light on the mechanism that triggers its deformation before shattering. The study improves understanding of glassy deformation and may accelerate broader application of metallic glass, a moldable, wear-resistant, magnetically exploitable material that is three times as strong as the mightiest steel and ten times as springy.
17-Sep-2014 Predicting performance
Lignin, a low-cost byproduct of the pulp, paper and biofuels industries, can be transformed into a cheaper version of highly engineered graphite through a simple and industrially scalable manufacturing process.
28-Aug-2014 Materials scientists play atomic 'Jenga' and make a surprising discovery
Researchers got a surprise when they built a highly ordered lattice by layering thin films containing lanthanum, strontium, oxygen and iron. Although each layer had an intrinsically nonpolar (symmetric) distribution of electrical charges, the lattice as a whole had an asymmetric distribution of charges. The charge asymmetry creates an extra 'switch' that brings new functionalities to materials when 'flipped' by external stimuli. The material's defects induced this polar behavior and can provide a new mechanism for manipulating electricity and magnetism in energy and information technologies.
The Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.