Scientists have begun turning to new tools offered by machine learning to help save time and money. In the past several years, nuclear physics has seen a flurry of machine learning projects come online, with many papers published on the subject. Now, 18 authors from 11 institutions summarize this explosion of artificial intelligence-aided work in “Machine Learning in Nuclear Physics,” a paper recently published in Reviews of Modern Physics. The paper is also available on arXiv.
“It was important to document the work that has been done. We really do want to raise the profile of the use of machine learning in nuclear physics to help people see the breadth of the activities,” said Amber Boehnlein, lead author of the paper and the associate director for computational science and technology at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility.
Because the paper gathers and summarizes major work in the field thus far, Boehnlein hopes it can act as an educational resource for interested readers, as well as a roadmap for future endeavors.
“It provides a benchmark that people can use as they go forward into the next phase,” she said.
A machine learning revolution
After attending a workshop exploring artificial intelligence at Jefferson Lab in March 2020 and publishing a follow-up report, Boehnlein and two of her co-authors, Witold Nazarewicz and Michelle Kuchera, were inspired to go a step further. Together with 15 colleagues representing all subfields of nuclear physics, they decided to conduct a survey of the state of machine learning projects in nuclear physics.
They started at the beginning. As the authors describe, the first significant work employing machine learning in nuclear physics used computer experiments to study nuclear properties, such as atomic masses, in 1992. Although this work hinted at machine learning’s potential, its use in the field remained minimal for more than two decades. In the last several years, that changed.
Machine learning involves building models that learn to perform tasks from data rather than from explicit instructions. Training and running those models places heavy demands on computers, including complicated, large-scale calculations. With recent advances in computing, computers can better meet these demands, which has allowed physicists to more readily incorporate machine learning into their work.
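To make the phrase "without explicit instruction" concrete, here is a minimal, purely illustrative sketch (not drawn from the paper): instead of programming a rule directly, we give a model example inputs and outputs and let it infer the underlying relationship, in this case by least-squares fitting.

```python
import numpy as np

# Hypothetical training data: the "true" rule y = 2x + 1 plus noise.
# The program below is never told this rule; it learns it from examples.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, size=50)

# "Training": fit slope and intercept by least squares.
A = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

print(slope, intercept)  # values close to 2.0 and 1.0
```

Real nuclear physics applications replace this linear fit with far richer models, such as deep neural networks, but the principle is the same: parameters are learned from data rather than specified by hand.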
“This would have been a less interesting paper in 2019, because there wouldn’t have been enough work to catalog. But now, there is significant work to cite due to the increased use of the techniques,” Boehnlein said.
Today, machine learning spans all scales and energy ranges of research, from investigations of matter’s building blocks to inquiries into the life cycles of stars. It is also found across the four subfields of nuclear physics: theory, experiment, accelerator science and operations, and data science.
“We made an effort to compile a comprehensive, collective resource that bridges the efforts in our subfields, which will hopefully spark rich discussions and innovation across nuclear physics,” said co-author Kuchera, who is an associate professor of physics and computer science at Davidson College.
Machine learning models can help with both the design and execution of experiments in nuclear physics. They can also aid in analyzing those experiments’ data, which can run to petabytes.
“I expect machine learning to become embedded into our data collection and analysis,” Kuchera said.
Machine learning will speed up these processes, which could reduce the time and money needed for beamtime, computer usage, and other experimental costs.
Connecting theory and experiment
So far, however, machine learning has developed the strongest foothold in nuclear theory. Nazarewicz, who is a nuclear theorist and chief scientist at the Facility for Rare Isotope Beams at Michigan State University, is especially interested in this subject. He says that machine learning can help theorists do advanced calculations faster, improve and simplify models, make predictions, and quantify the uncertainties of those predictions. It can also be used to study phenomena that researchers cannot conduct experiments on, such as supernova explosions or neutron stars.
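One common pattern behind "doing advanced calculations faster" is emulation: run an expensive theory calculation at a handful of points, train a fast surrogate on those results, then query the surrogate instead of the full calculation. The sketch below is purely illustrative, with a hypothetical stand-in function and a simple polynomial surrogate; it is not a method from the paper.

```python
import numpy as np

def expensive_model(x):
    # Hypothetical stand-in for a costly theory calculation.
    return np.sin(x) + 0.1 * x**2

# Run the expensive model at just a few training points...
x_train = np.linspace(0.0, 3.0, 8)
y_train = expensive_model(x_train)

# ...and fit a cheap polynomial surrogate to those results.
coeffs = np.polyfit(x_train, y_train, deg=4)
surrogate = np.poly1d(coeffs)

# The surrogate now gives fast approximate predictions anywhere in
# the training range without rerunning the expensive calculation.
x_new = 1.7
print(surrogate(x_new), expensive_model(x_new))  # nearly equal
```

In practice, surrogates in nuclear theory are often Gaussian processes or neural networks rather than polynomials, and the "expensive model" may take hours or days per evaluation, which is what makes the speedup valuable.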
“Neutron stars are not very user friendly,” said Nazarewicz.
He uses machine learning to study hyperheavy nuclei and elements, which contain so many protons and neutrons that they can’t be observed experimentally.
“I find the results to be the most impressive in the theory community, particularly the low-energy theory community that Witold is associated with,” Boehnlein said. “They seem to be really embracing these techniques.”
Boehnlein said theorists have also started to embrace these techniques at Jefferson Lab in their study of proton and neutron structures. Specifically, machine learning can help extract information from complicated theories, such as quantum chromodynamics, the theory that describes the interactions between the quarks and gluons that make up protons and neutrons.
The authors predict that machine learning’s involvement in both theory and experiment will speed up these subfields independently, and it will also better interconnect them to speed up the entire loop of the scientific process.
“Nuclear physics helps us make discoveries to better understand the nature of our universe, and it’s also used for societal applications,” said Nazarewicz. “The faster we can do the cycle between experiment and theory, the faster we will arrive at discoveries and applications.”
As machine learning continues to grow in this field, the authors expect to see more developments and broader applications incorporating this tool.
“I think we're only in the infancy of the application of machine learning to nuclear physics,” Boehnlein said.
And, along the way, this paper will act as a reference, even for its own authors.
“I hope the paper is used as a resource to understand the current state of machine learning research, allowing us to build from these efforts,” Kuchera said. “My research is centered on machine learning methods, so I absolutely will utilize this paper as a window into the state of machine learning across nuclear physics right now.”
Journal Article: Machine Learning in Nuclear Physics
By Chris Patrick
Jefferson Science Associates, LLC, manages and operates the Thomas Jefferson National Accelerator Facility, or Jefferson Lab, for the U.S. Department of Energy's Office of Science.
Michigan State University operates the Facility for Rare Isotope Beams as a user facility for the U.S. Department of Energy Office of Science, supporting the mission of the Office of Nuclear Physics.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.