29-Sep-2015 Titan helps unpuzzle decades-old plutonium perplexities
First produced in 1940, plutonium is one of the most electronically complicated elements on Earth -- and because of its complexities, scientists have struggled ever since to prove the existence of its magnetic properties. Finally, that struggle is over, thanks to a timely combination of theory, algorithm, and code developments; neutron experiments; and Titan -- the second-most-powerful supercomputer in the world.
20-Aug-2015 Carbon number crunching
A booming economy and population led China to emerge in 2006 as the global leader in fossil-fuel carbon emissions, a distinction it still maintains. But exactly how much carbon China releases has been a topic of debate, with recent estimates varying by as much as 15 percent.
18-Aug-2015 Viral comparisons
An Oak Ridge National Laboratory team of comparative genomics and computational science researchers compared approximately 4,000 complete virus genomes downloaded from a public database known as GenBank. By compressing the sequence files, the team created a virus dendrogram that maps out the relationships among all the different virus families.
11-Aug-2015 Eyes on the prize
Recently, the Department of Energy Office of Science's Nanoscale Science Research Centers at Argonne, Brookhaven, Lawrence Berkeley, Los Alamos/Sandia and Oak Ridge national laboratories jointly organized a workshop at Oak Ridge National Laboratory to discuss opportunities and challenges as imaging and data sciences merge. Those efforts will likely aid the Materials Genome Initiative, which aims to speed new materials to the global marketplace.
7-Jul-2015 Big PanDA and Titan merge to tackle torrent of LHC's full-energy collision data
With the successful restart of the Large Hadron Collider, now operating at nearly twice its former collision energy, comes an enormous increase in the volume of data physicists must sift through to search for new discoveries. Fortunately, a remarkable data-management tool developed by physicists at Brookhaven National Laboratory and the University of Texas at Arlington is evolving to meet the big-data challenge.
19-Jun-2015 SLAC research resumes at upgraded Large Hadron Collider
Research with the Large Hadron Collider has officially resumed: The world's largest particle accelerator at CERN began on June 3 to collect data at a new record energy that could hold the key to new scientific discoveries. To keep up with the boost in performance, researchers at the Department of Energy's SLAC National Accelerator Laboratory have developed new technologies for ATLAS -- one of two experiments involved in the 2012 discovery of the Higgs boson.
15-Jun-2015 What the blank makes quantum dots blink?
Quantum dots promise an astounding range of applications, if scientists can conquer their annoying habit of blinking. Researchers computing at NERSC recently ran simulations that offer new insights into the problem.
1-Jun-2015 Meraculous: Deciphering the 'book of life' with supercomputers
A team of scientists from Berkeley Lab, JGI and UC Berkeley simplified and sped up genome assembly, reducing a months-long process to mere minutes. This was primarily achieved by 'parallelizing' the code to harness the processing power of supercomputers, such as NERSC's Edison system.
27-May-2015 The 'why' of models
An international team of researchers from Oak Ridge National Laboratory, Macquarie University, the University of Western Sydney and the Max Planck Institute for Biogeochemistry set out to assess how two Free-Air CO2 Enrichment projects compared to eleven vegetation models that simulate various ecological processes. Instead of only benchmarking whether or not an individual model matched the experimental data, the researchers developed an 'assumption-centered' approach to evaluate why certain models performed better than others.
13-May-2015 Digitizing neurons
Supercomputing resources at Oak Ridge National Laboratory will support a new initiative designed to advance how scientists digitally reconstruct and analyze individual neurons in the human brain.
11-May-2015 'Chombo-crunch' sinks its teeth into fluid dynamics
Researchers at Lawrence Berkeley National Laboratory are breaking new ground in the modeling of complex flows in energy and oil and gas applications, thanks to a computational fluid dynamics and transport code dubbed 'Chombo-Crunch.'
8-May-2015 New method relates Greenland ice sheet changes to sea-level rise
Climate models cannot yet include full models of the Greenland and Antarctic ice sheets or dynamically simulate how ice sheet changes influence sea level. Early schemes failed to accurately account for mass gained through snowfall and mass lost through snowmelt, changes that depend on ice sheet elevation and region. A new method now accounts for both effects.
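The article doesn't give the new method's equations, but the elevation dependence it mentions can be illustrated with a toy surface-mass-balance calculation: melt shrinks with elevation (colder air aloft), so summing accumulation minus melt over elevation bands changes the answer relative to a single whole-sheet estimate. All numbers below are invented for illustration:

```python
# Toy surface mass balance (SMB) per elevation band -- illustrative only,
# not the scheme described in the article.

def band_smb(snowfall_m, melt_at_sea_level_m, elevation_m, melt_lapse_per_km=0.6):
    """Net mass change (m water-equivalent/yr) for one band:
    accumulation minus melt, with melt reduced at higher elevation."""
    melt = max(0.0, melt_at_sea_level_m - melt_lapse_per_km * elevation_m / 1000.0)
    return snowfall_m - melt

# Hypothetical bands: (area in km^2, mean elevation in m)
bands = [(200_000, 500), (400_000, 1500), (500_000, 2500)]
snowfall, melt0 = 0.4, 1.5  # m w.e. per year, invented values

total = sum(area * band_smb(snowfall, melt0, elev) for area, elev in bands)
print(f"Net balance: {total:.0f} km^2*m w.e./yr")
```

Here the low band loses mass, the high band gains it, and only the band-by-band sum captures that trade-off.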
1-May-2015 New mathematical method enhances hydrology simulations
Just as a racecar's engine needs the right fuel to get the best performance, climate models need finely tuned parameters to accurately simulate the impacts of different technologies and policies. A team led by researchers at Pacific Northwest National Laboratory applied sophisticated mathematical methods to fine-tune the water and energy exchange parameters -- numerical stand-ins for complex processes -- to better simulate water and energy fluxes.
31-Mar-2015 BigNeuron: Unlocking the secrets of the human brain
To find a standard 3-D neuron reconstruction algorithm, BigNeuron will sponsor a series of international hackathons and workshops where contending algorithms will be ported onto a common software platform to analyze neuronal physical structure using the same core dataset. All ported algorithms will be bench-tested at DOE's NERSC and ORNL, as well as Human Brain Project supercomputing centers.
25-Mar-2015 Protein shake-up
Proteins are essential to living organisms, which need them to thrive. In recent years, a certain class of proteins has challenged researchers' conventional notion that proteins have a static, well-defined structure.
20-Mar-2015 Organic photovoltaics experiments showcase HPC 'superfacility' concept
A collaborative effort linking the Advanced Light Source at Lawrence Berkeley National Laboratory with supercomputing resources at NERSC and the Oak Ridge Leadership Computing Facility is yielding exciting results in organic photovoltaics research that could transform the way researchers use these facilities and improve scientific productivity in the process.
18-Mar-2015 Nanostructure complex materials modeling
Brookhaven physicist Simon Billinge illustrates how advances in computing and applied mathematics can improve the predictive value of models used to design new materials.
17-Mar-2015 Granular data processing on HPCs using an event service
Brookhaven Lab/ATLAS physicist Torre Wenaus describes an effort to trickle small 'grains' of data generated by the ATLAS experiment at the Large Hadron Collider (LHC) in Europe into small pockets of unused supercomputing time, sandwiched between big jobs on high-performance supercomputers.
5-Mar-2015 Between micro and macro, Berkeley Lab mathematicians model fluids at the mesoscale
The math whizzes at Berkeley Lab's Center for Computational Sciences and Engineering are at the forefront of a neglected corner of the scientific world, building mathematical models for fluids at the mesoscale. The little-known field of fluctuating hydrodynamics could have enormous impacts in applications ranging from batteries to drug delivery to microfluidic devices.
20-Jan-2015 Pinpointing the magnetic moments of nuclear matter
Using supercomputing resources at the National Energy Research Scientific Computing Center at Berkeley Lab, a team of nuclear physicists has demonstrated for the first time the ability of quantum chromodynamics (QCD) -- a fundamental theory in particle physics -- to calculate the magnetic structure of some of the lightest nuclei. Their findings are part of an ongoing effort to further our understanding of the universe.
16-Dec-2014 A standard for neuroscience data
In November, Neurodata Without Borders hosted a hackathon to consolidate ideas for designing and implementing a standard neuroscience file format. BrainFormat, a neuroscience data standardization framework developed at Berkeley Lab, was one of several candidates selected for further investigation and is now a strong contender to contribute to a community-wide data format and storage standard for neuroscience research.
The Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.