Every task we perform on a computer, whether crunching numbers, watching a video, or typing out an article, requires different components of the machine to interact with one another. "Communication is massively crucial for any computation," says former SFI Graduate Fellow Abhishek Yadav, a Ph.D. scholar at the University of New Mexico. Yet scientists don't fully grasp how much energy computational devices spend on communication.
Over the last decade, SFI Professor David Wolpert has spearheaded research to unravel the principles underlying the thermodynamic costs of computation. Wolpert notes that determining the "thermodynamic bounds on the cost of communication" is an overlooked but critical issue in the field, as it applies not only to computers but also to communication systems across the board. "They are everything that holds up modern society," he says.
Now, a new study in Physical Review Research, co-authored by Yadav and Wolpert, sheds light on the unavoidable heat dissipation that occurs when information is transmitted across a system, challenging an earlier view that, in principle, communication incurs no energetic cost. For the study, the researchers combined principles from computer science, communication theory, and stochastic thermodynamics, a branch of statistical physics that deals with real-world, out-of-equilibrium systems such as smartphones and laptops.
Using a logical abstraction of generic communication channels, the researchers determined the minimum amount of heat a system must dissipate to transmit one unit of information. This abstraction could apply to any communication channel, whether artificial (e.g., an optical cable) or biological (e.g., a neuron firing a signal in the brain). Real-world communication channels always carry some noise that can interfere with information transmission, and the framework developed by Yadav and Wolpert shows that the minimum heat dissipation is at least equal to the amount of useful information, technically called mutual information, that makes it through the channel's noise.
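To make that idea concrete, here is a minimal numerical sketch, not taken from the paper: it assumes a binary symmetric channel with a uniform input, and it converts mutual information into a heat figure using the standard Landauer factor of k_B T ln 2 joules per bit at an assumed room temperature. The channel model, temperature, and conversion factor are all illustrative assumptions, not the authors' derivation.

```python
import math

# Toy model (an illustration, not the paper's derivation): mutual information
# surviving a binary symmetric channel, converted to a minimum heat per
# channel use via the Landauer factor k_B * T * ln(2) joules per bit.

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed operating temperature, K

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a binary variable with bias p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_mutual_information(flip: float) -> float:
    """I(X;Y), in bits per channel use, for a binary symmetric channel
    with a uniform input; noise flips each bit with probability `flip`."""
    return 1.0 - binary_entropy(flip)

for flip in (0.0, 0.05, 0.11, 0.5):
    info = bsc_mutual_information(flip)   # useful bits that get through
    heat = info * K_B * T * math.log(2)   # minimum dissipated heat, J
    print(f"flip prob {flip:4.2f}: I(X;Y) = {info:.3f} bits/use, "
          f"heat floor ~ {heat:.2e} J/use")
```

In this toy picture, a noisier channel lets less mutual information through, so the floor on dissipated heat per channel use falls toward zero as noise destroys the signal.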
Next, they used another broadly applicable abstraction of how modern computers perform computations to derive the minimum thermodynamic costs of encoding and decoding, the steps that ensure reliable transmission of messages by mitigating channel noise. Here, the researchers gained a significant insight: improving the accuracy of data transmission through better encoding and decoding algorithms comes at the cost of increased heat dissipation within the system.
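The tradeoff can be illustrated with a toy repetition code, a deliberately simple stand-in for the general encoding and decoding schemes the paper analyzes: repeating each message bit and majority-voting at the receiver drives the error rate down, but every extra channel use adds to the heat floor. All parameters below are illustrative assumptions.

```python
import math
from math import comb

# Toy accuracy-vs-heat tradeoff (an illustration, not the paper's model):
# a repetition code over a binary symmetric channel with majority-vote
# decoding. More repetitions mean fewer decoding errors but more channel
# uses, each carrying a Landauer-style heat floor of k_B * T * ln(2).

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed operating temperature, K
FLIP = 0.1          # assumed per-bit flip probability of the channel

def majority_error(n: int, p: float) -> float:
    """Probability that more than half of n transmitted copies flip,
    so majority voting decodes the message bit incorrectly."""
    return sum(comb(n, k) * p**k * (1.0 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 3, 5, 9, 15):
    err = majority_error(n, FLIP)
    heat = n * K_B * T * math.log(2)  # heat floor per message bit, J
    print(f"repeat x{n:2d}: error ~ {err:.1e}, heat floor ~ {heat:.2e} J/bit")
```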
Uncovering the unavoidable energy costs of sending information through communication channels could help engineers build more energy-efficient systems. Yadav reckons that the von Neumann architecture used in current computers incurs significant energetic costs from communication between the CPU and memory. "The principles that we are outlining can be used to draw inspiration for future computer architecture," he says.
As these energy costs apply to all communication channels, the work presents a potential avenue for researchers to deepen our understanding of various energy-hungry complex systems where communication is crucial, from biological neurons to artificial logic circuits. Despite burning 20% of the body's calorie budget, the brain uses energy far more efficiently than artificial computers do, says Yadav. "So it would be interesting to see how natural computational systems like the brain are coping with the cost associated with communication."
Read the paper "Minimal thermodynamic cost of communication" in Physical Review Research (December 22, 2025). DOI: 10.1103/qvc2-32xr
Journal
Physical Review Research
Method of Research
Computational simulation/modeling
Subject of Research
Not applicable
Article Title
Minimal thermodynamic cost of communication
Article Publication Date
22-Dec-2025