3-Nov-2016 SLAC, Berkeley Lab researchers prepare for scientific computing on the exascale
Researchers at the Department of Energy's SLAC National Accelerator Laboratory are playing key roles in two recently funded computing projects aimed at developing cutting-edge scientific applications for future exascale supercomputers. These machines will perform at least a billion billion computing operations per second -- 50 to 100 times more than the most powerful supercomputers in the world today.
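The "50 to 100 times" figure can be sanity-checked with simple arithmetic. This sketch assumes a baseline of roughly 10-20 petaflops of sustained performance, which is approximately where the fastest US supercomputers stood in late 2016; the baseline values here are illustrative assumptions, not figures from the story.

```python
# Back-of-envelope check of the "50 to 100 times" exascale claim.
# Assumption: the comparison baseline is a system sustaining
# roughly 10-20 petaflops (10e15 to 20e15 operations per second).

EXAFLOP = 1e18               # exascale: a billion billion ops/second
baseline_low = 10e15         # assumed 10-petaflops baseline
baseline_high = 20e15        # assumed 20-petaflops baseline

speedup_high = EXAFLOP / baseline_low    # upper end of the range
speedup_low = EXAFLOP / baseline_high    # lower end of the range

print(f"Exascale is {speedup_low:.0f}x to {speedup_high:.0f}x faster")
# prints "Exascale is 50x to 100x faster"
```

Against a larger baseline (the ~100-petaflops machines that topped the TOP500 list by mid-2016), the ratio shrinks to roughly 10x, so the quoted range depends on which "most powerful" systems are used as the reference point.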
2-Nov-2016 Sandia to evaluate if computational neuroscientists are on track
The Intelligence Advanced Research Projects Activity (IARPA) launched the Machine Intelligence from Cortical Networks (MICrONS) project earlier this year. Sandia National Laboratories is refereeing the work of three university-led teams to map, understand and mathematically re-create visual processing in the brain to close the computer-human gap in object recognition.
27-Oct-2016 PPPL physicists win funding to lead a DOE exascale computing project
A proposal from PPPL scientists has been chosen as part of a national initiative to develop the next generation of supercomputers. Known as the Exascale Computing Project, the initiative will include a focus on exascale-related software, applications, and workforce training.
5-Aug-2016 Researchers combine simulation, experiment for nanoscale 3-D printing
A research team led by Oak Ridge National Laboratory has created a high-power simulation and design process to print free-standing 3-D structures on the nanoscale using focused electron beam induced deposition. The simulation-guided nanomanufacturing method allows researchers to design and construct complex high-fidelity nanostructures with less guesswork.
Alex Aiken, director of the new Computer Science Division at the Department of Energy's SLAC National Accelerator Laboratory, has been thinking a great deal about the coming challenges of exascale computing, defined as a billion billion calculations per second. That's a thousand times faster than any computer today. Reaching this milestone is such a big challenge that it's expected to take until the mid-2020s and require entirely new approaches to programming, data management and analysis, and numerous other aspects of computing.
27-May-2016 ORNL researchers use strain to engineer first high-performance, two-way oxide catalyst
Catalysts make chemical reactions more likely to occur. In most cases, a catalyst that's good at driving chemical reactions in one direction is bad at driving reactions in the opposite direction. However, a research team led by the Department of Energy's Oak Ridge National Laboratory has created the first high-performance, two-way oxide catalyst and filed a patent application for the invention. The accomplishment is reported in the Journal of the American Chemical Society.
19-May-2016 Berkeley Lab's OpenMSI licensed to ImaBiotech
Two years ago, Lawrence Berkeley National Laboratory (Berkeley Lab) researchers developed OpenMSI -- the most advanced computational tool for analyzing and visualizing mass spectrometry imaging (MSI) data. Last year, this web-available tool was selected as one of the 100 most technologically significant new products of the year by R&D Magazine. Now, OpenMSI has been licensed to support ImaBiotech's Multimaging™ technology in the field of pharmaceutical and cosmetic research and development.
14-Mar-2016 Brookhaven Lab named an NVIDIA GPU Research Center
NVIDIA, the world leader in visual computing, has named the US Department of Energy's Brookhaven National Laboratory a 2016 GPU Research Center. Brookhaven Lab was recognized for its use of graphics processing unit (GPU)-accelerated computing to conduct research in fields including materials science, physics, and climate science, and for its vision to further the application of GPU-accelerated computing in those and other research fields with a high computational demand.
9-Mar-2016 ORNL's benchmark data set validates global nuclear reactor codes
A re-analysis of nuclear fuel rods from a commercial reactor used improved radiochemical methods developed at Oak Ridge National Laboratory and characterized more than 50 different isotopes and 16 elements with high accuracy. It produced an experimental data set with uncertainties many times smaller than those obtained by the earlier radiochemical analysis. Modeling and simulation experts at ORNL applied the more accurate experimental data to validate codes widely used by the nuclear safeguards research community.
22-Feb-2016 Updated workflows for new LHC era
After a massive upgrade, the Large Hadron Collider (LHC) is smashing particles at an unprecedented 13 teraelectronvolts (TeV) -- nearly double the energy of its previous run. In just one second, the LHC can now produce up to 1 billion collisions and generate up to 10 gigabytes of data.
To deal with the new data deluge, researchers working on the LHC's ATLAS experiment are relying on updated workflow management tools developed primarily by Berkeley Lab researchers.
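The quoted rates give a sense of why updated workflow tools are needed. This is a rough, illustrative calculation based only on the peak figures above; in practice trigger systems discard the vast majority of events and the beams are not colliding around the clock, so actual stored data volumes are far smaller.

```python
# Rough scale of the post-upgrade LHC peak data rates quoted above.
# These are "up to" figures, not sustained averages.

collisions_per_sec = 1e9        # up to 1 billion collisions per second
data_rate_gb_per_sec = 10       # up to 10 gigabytes of data per second

seconds_per_day = 24 * 60 * 60  # 86,400 seconds in a day

# If the peak rate were sustained for a full day (it is not):
peak_tb_per_day = data_rate_gb_per_sec * seconds_per_day / 1000

print(f"Peak rate sustained for one day: {peak_tb_per_day:.0f} TB")
# prints "Peak rate sustained for one day: 864 TB"
```

Even a small fraction of that volume, accumulated across years of running, explains the emphasis on workflow management for the ATLAS experiment.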
21-Jan-2016 Explore galaxies far, far away at internet speeds
Scientists have released an 'expansion pack' for a virtual tour of the universe that you can enjoy from the comfort of your own computer. The latest version of the publicly accessible sky images, which can be viewed with an interactive Sky Viewer tool, roughly doubles the searchable universe compared with the project's original release in May.
21-Jan-2016 Facility staff and DOE computer scientists collaborate to speed up experimental data analysis
In early December, the US Department of Energy's (DOE) Brookhaven National Laboratory hosted the first in a series of week-long 'hackathons,' a code brainstorming session attended by nearly 40 computer scientists and software developers from several DOE Office of Science User Facilities, including those at Argonne, Berkeley, Oak Ridge and SLAC national laboratories.
5-Oct-2015 DOE announces funding for new center for computational materials sciences at Brookhaven Lab
The US Department of Energy (DOE) has announced $12 million in funding over the next four years for a new Center for Computational Design of Functional Strongly Correlated Materials and Theoretical Spectroscopy at Brookhaven National Laboratory and Rutgers University. Center scientists will develop next-generation methods and software to accurately describe electronic properties in complex strongly correlated materials, as well as a companion database to predict targeted properties, with energy-related applications such as thermoelectric materials.
29-Sep-2015 Titan helps unpuzzle decades-old plutonium perplexities
First produced in 1940, plutonium is one of the most electronically complicated elements on Earth -- and because of its complexities, scientists have struggled ever since to prove the existence of its magnetic properties. Finally, that struggle is over, thanks to a timely combination of theory, algorithm and code developments, neutron experiments, and Titan -- the second-most-powerful supercomputer in the world.
20-Aug-2015 Carbon number crunching
A booming economy and population led China to emerge in 2006 as the global leader in fossil-fuel carbon emissions, a distinction it still maintains. But exactly how much carbon China releases has been a topic of debate, with recent estimates varying by as much as 15 percent.
The Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.