U.S. Department of Energy Research News
EurekAlert! A Service of the American Association for the Advancement of Science

 

Data-intensive computing key to predictive science



Data-intensive computing advances scientific discovery, helping researchers understand the fundamentals of complex systems and gain insight into systems biology.

Protecting the nation from terrorist attacks, discovering the hidden secrets of genes, and monitoring and controlling the electrical power grid all require the ability to process and analyze massive amounts of data and information in real time.

"The power to make breakthroughs and solve complex problems lies in our ability to successfully manage the increase in data, extract valuable knowledge from the multiple and massive data sets, and reduce the data for understanding and timely decision making," said Deborah Gracio, deputy director of Computational Sciences and Mathematics.

Gracio leads the Data-Intensive Computing Initiative (DICI) at Pacific Northwest National Laboratory. The four-year initiative aims to create a computing infrastructure that integrates data-intensive computational tools with domain science problems such as national security, biology, environment, and energy, to facilitate the next frontier: predictive science. According to Gracio, the computing infrastructure will enable predictive systems that aid scientists in developing predictors, or means for understanding the precursors to an event. "They can start to identify the biomarkers in the environment that could cause contamination or be able to observe a pattern in the way terrorists interact, opening the possibility to change the outcome."

Staff scientist Ian Gorton, a recent recruit from Australia (see "Meet" below), is the chief architect for creating the computing infrastructure. Gorton, whose goal is to develop a robust, flexible integrated system architecture encompassing both hardware and software, calls the project Medici, alluding to the Florentine architects of the Italian Renaissance and playing on DICI.

"The focus of Medici is the construction of software tools, or the underlying plumbing, that will allow applications to be plugged together so that scientists and application developers can create complex, data-intensive applications," Gorton said. "Our primary aim is to create technologies that provide scientists the ability to create various applications on a single underlying architecture. And, once created, these applications will run fast and reliably, and they’ll be able to adapt in certain ways to changes in their environment while they’re actually executing."
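The article does not show Medici's actual programming interface, but the plug-together idea Gorton describes can be illustrated with a minimal, hypothetical sketch: independent processing components are chained so that each stage's output feeds the next, and the assembled pipeline runs over a data source. All class and function names below are invented for illustration, not Medici's API.

```python
class Component:
    """One processing stage; stages can be plugged together into a pipeline."""

    def __init__(self, transform):
        self.transform = transform  # the stage's processing function
        self.downstream = []        # stages plugged into this one's output

    def connect(self, other):
        """Plug another component into this one's output; returns it for chaining."""
        self.downstream.append(other)
        return other

    def process(self, item, sink):
        """Apply this stage, then push the result to all downstream stages."""
        result = self.transform(item)
        if not self.downstream:
            sink.append(result)  # leaf stage: emit the final result
        for nxt in self.downstream:
            nxt.process(result, sink)


def run_pipeline(source, head):
    """Feed each item from the data source through the assembled pipeline."""
    results = []
    for item in source:
        head.process(item, results)
    return results


# Example: a mock "sensor" feed plugged into two analysis stages.
sensor = Component(str.lower)                      # normalize raw input
tokenize = Component(lambda text: text.split())    # split into tokens
count = Component(len)                             # analytic: token count
sensor.connect(tokenize).connect(count)

results = run_pipeline(["Packet ALERT scan", "ok"], sensor)
# results == [3, 1]
```

The point of this structure is the one Gorton makes: the stages know nothing about each other, so different applications can be composed from the same parts on a single underlying architecture.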

Gorton has worked for nearly two decades in the software architecture research world. "The types of applications I tend to build always involve many distributed computers and databases. They’re incredibly difficult to build for various technical reasons, so it’s always been a fascination of mine to try and build and use technology to make integrating all these different types of systems easier." Gorton’s team had the opportunity to demonstrate the Medici technology at Supercomputing 06. "Using our very first version of Medici, we plugged together a set of network sensors and analytical tools that were developed by various researchers at the Laboratory for cyber security purposes," he said. "And it all worked beautifully."

###

 
