An explanation of how ‘intelligence’ emerges from the statistical physics of networks.
BY DEBDUTTA PAUL
Much has been written about the 2024 Nobel Prize in Physics, especially about whether, and how, physics is connected to the techniques that make artificial intelligence possible. This commentary weighs in, focusing on the physics of networks and emergence.
The key idea is that intelligence can be stored in a network of zeroes and ones, or bits. Just as the magnetism of a material emerges from the collective interaction of its electrons’ intrinsic spins, John Hopfield showed that interacting bits of information can collectively enable a machine to learn. For Hopfield, learning meant recognising patterns in images.
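To make the analogy concrete, here is the textbook form of the two energies (added here for illustration; the notation is standard and not spelled out in this commentary’s sources). The Hopfield network’s energy has the same mathematical shape as that of an interacting magnet, with learned weights in place of magnetic couplings:

```latex
% Ising magnet: electron spins s_i = +1 or -1, coupled with strengths J_ij
E_{\text{magnet}} = -\sum_{i<j} J_{ij}\, s_i s_j
% Hopfield network: binary neurons s_i = +1 or -1, coupled by learned weights w_ij
E_{\text{network}} = -\frac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j
```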
Information can mean different things. An image of a dog, for example, is a string of bits. But a string of bits can also represent complete gibberish, conveying no meaning collectively. Hopfield showed that meaningful information can be stored, processed, and reproduced by a network of bits following natural processes. In particular, the process that minimises the total energy of the network achieves this goal.
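A minimal sketch of this idea in code, using the textbook Hopfield formulation (Hebbian storage and asynchronous updates); the toy 8-bit pattern and all names are illustrative, not from the original work:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(patterns):
    """Hebbian storage: w_ij = (1/N) * sum over patterns of x_i * x_j."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)          # no self-connections
    return w

def energy(w, s):
    """Hopfield energy E = -1/2 * s^T W s; updates only ever lower it."""
    return -0.5 * s @ w @ s

def recall(w, s, steps=100):
    """Asynchronous dynamics: flip one neuron at a time toward lower energy."""
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if w[i] @ s >= 0 else -1
    return s

# Store one 8-bit "memory", corrupt two bits, and let the dynamics repair it.
memory = np.array([1, -1, 1, 1, -1, -1, 1, -1])
w = train(memory[None, :])
noisy = memory.copy()
noisy[[0, 3]] *= -1
print(energy(w, noisy), energy(w, recall(w, noisy)))  # energy decreases
print(np.array_equal(recall(w, noisy), memory))       # pattern recovered
```

Each update can only lower the energy, so the network slides into the nearest stored pattern: exactly the retrieval from partial knowledge described below.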
To do this, Hopfield mimicked the workings of the brain, where neurons switch on or off depending on the information being processed. The collective activity, or ‘firing’, of the neurons represents this information. In this language, when a baby learns to identify a dog’s image, a network of neurons fires in a particular pattern. A different firing pattern would not produce the same recognition, so the baby’s learning boils down to tuning the network until the right collective pattern fires.
Hopfield showed that computers can mimic this learning through purely physical processes.
“Two concepts are very important. One is that neurons fire, and two that there are many interconnections, called synapses. This model incorporates both of them,” said Professor Chandan Dasgupta of ICTS-TIFR.
Emergence
Meaningful information emerges from the collective physical state of the neurons and synapses. The network’s dynamics, driven by the interactions between neurons, thus determine what it does with the information. “When you start with some partial knowledge, the network retrieves the memories by the collective dynamics of the network,” said Chandan.
Later, Geoffrey Hinton and his colleagues extended the idea to networks that perform other tasks, such as learning an unknown probability distribution from data. They introduced what physicists call ‘Boltzmann machines’ and their variants. This work brought in the concept of uncertainty, modelling the network’s neurons statistically using techniques well known in statistical physics. Hinton also introduced more complex interactions in these networks, demonstrating that they made learning more efficient.
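A toy sketch of how such a network learns a distribution. It uses the restricted Boltzmann machine, a later, tractable variant, trained with one-step contrastive divergence, Hinton’s later shortcut rather than the original 1980s algorithm; the tiny dataset and all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 3, 4
W = 0.1 * rng.standard_normal((n_visible, n_hidden))
a = np.zeros(n_visible)               # visible biases
b = np.zeros(n_hidden)                # hidden biases

# Training data: the network should learn that these two patterns are likely.
data = np.array([[1, 0, 1], [0, 1, 0]] * 50, dtype=float)

lr = 0.1
for epoch in range(2000):
    v0 = data
    # Positive phase: sample the hidden units given the data.
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one step of Gibbs sampling ("dreaming" a reconstruction).
    pv1 = sigmoid(h0 @ W.T + a)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)
    # CD-1 update: nudge model statistics toward data statistics.
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(data)
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)

# After training, a stored pattern should reconstruct itself stably.
probe = np.array([[1.0, 0.0, 1.0]])
h = sigmoid(probe @ W + b)
print(sigmoid(h @ W.T + a).round(2))  # typically close to the probe pattern
```

The update rule has no task-specific instructions in it: the network simply adjusts its couplings until its own fluctuations statistically resemble the data.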

Although the computation still boils down to performing many calculations over and over, there is a key difference from conventional computing. Earlier, computer programs fine-tuned a set of predetermined rules. Now, the machine corrects the properties of the network itself, changing the way it reacts to information in light of its goals. The dynamics of the network implement those goals; that is, it learns. This general notion of adapting to external information is what we call machine learning, made possible by the laureates’ work.
“It is a new way of computing, because the computing is done by the whole network, in the sense that you are not giving it instructions,” said Chandan. “It is the collective dynamics of the network which is doing the computation by itself.”
More fundamental physics?
We can speculate further on the meanings of spins, networks, and emergence in fundamental physics. Some physicists, like Roger Penrose, have suggested ‘spin networks’ as building blocks of the universe. The idea is that spacetime, the arena on which the laws of physics play out, is not fundamental. Instead, it emerges from more basic entities of the universe and their mutual interactions.

Theoretical physicists like Lee Smolin have used such ideas to propose mathematical theories of the universe. The underlying notion is the same: space and time emerge, the way a jigsaw puzzle emerges from its pieces, from the fundamental entities that build the universe.
Although these proposals differ considerably in their mathematical descriptions, they share the picture of networks of fundamental building blocks giving rise to the universe. In summary, the emergence of patterns from a network of simple bits may hold the key to the universe’s mysteries.
The author thanks Professor Chandan Dasgupta for discussions.
