Machine learning pioneers win the Nobel Prize for Physics

Two researchers whose work laid the foundations for the boom in artificial intelligence (AI) have won the 2024 Nobel Prize in Physics.
John Hopfield of Princeton University in New Jersey and Geoffrey Hinton of the University of Toronto, Canada, share the prize of 11 million Swedish kronor (US$1 million), which was announced on 8 October by the Royal Swedish Academy of Sciences in Stockholm.
Both used tools from physics to develop methods that power artificial neural networks: brain-inspired, layered structures that learn abstract concepts. Their discoveries "form the building blocks of machine learning that can help humans make faster and more reliable decisions," said Ellen Moons, chair of the Nobel Committee and a physicist at Karlstad University, Sweden. "Artificial neural networks have been used to advance research across physics topics, from particle physics to materials science to astrophysics."
In 1982, Hopfield, a theoretical biologist with a background in physics, devised a network that described the connections between its nodes as physical forces [1]. By storing patterns as low-energy states of the network, the system could restore a stored image when presented with a similar pattern. It became known as an associative memory, because it resembles the way the brain casts about when trying to recall a rarely used word or concept.
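The idea of storing patterns as low-energy states can be sketched in a few lines of NumPy. This is a minimal illustration under simplifying assumptions (bipolar ±1 units, the Hebbian outer-product storage rule, synchronous updates), not Hopfield's exact 1982 formulation:

```python
import numpy as np

def train(patterns):
    """Store patterns as low-energy states via the Hebbian outer-product rule."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / len(patterns)

def energy(W, state):
    """Hopfield energy; stored patterns sit at local minima."""
    return -0.5 * state @ W @ state

def recall(W, state, steps=10):
    """Iterate updates; the state descends in energy toward a stored pattern."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1  # break ties consistently
    return state

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])

noisy = pattern.copy()
noisy[0] *= -1           # corrupt one unit
restored = recall(W, noisy)
```

Corrupting a unit raises the network's energy; the update rule then descends the energy landscape back to the stored low-energy state, which is the "associative memory" behaviour described above.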
Hinton, a computer scientist, later extended these "Hopfield networks" using principles from statistical physics, which describes the collective behaviour of systems made up of too many components to track individually. By building probabilities into a layered version of the network, he created a tool that could recognize and classify images, or generate new examples of the type it was trained on [2].
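To give a flavour of the probabilistic, layered idea, here is a sketch of a restricted Boltzmann machine trained with one-step contrastive divergence, a simplified modern relative of the networks described above. The layer sizes, learning rate and omission of bias terms are illustrative assumptions, not the original 1980s formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    """Draw binary unit states with the given activation probabilities."""
    return (rng.random(p.shape) < p).astype(float)

def cd1_step(W, v0, lr=0.1):
    """One contrastive-divergence update: data -> hidden -> reconstruction -> hidden."""
    h0_p = sigmoid(v0 @ W)                     # hidden probabilities given the data
    v1 = sample(sigmoid(sample(h0_p) @ W.T))   # sampled reconstruction
    h1_p = sigmoid(v1 @ W)
    # Hebbian-style update: strengthen data correlations, weaken the model's own
    return W + lr * (np.outer(v0, h0_p) - np.outer(v1, h1_p))

data = np.array([1.0, 1.0, 0.0, 0.0])     # one training pattern
W = rng.normal(0.0, 0.1, size=(4, 2))     # 4 visible units, 2 hidden units
for _ in range(500):
    W = cd1_step(W, data)

# After training, reconstruction probabilities favour the stored pattern
v_p = sigmoid(sigmoid(data @ W) @ W.T)
```

Because the units are stochastic, the same machinery that reconstructs a training pattern can also be sampled to generate new examples of the kind it was trained on.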
These approaches differed from earlier computation in that the networks could learn from examples, including unstructured data, a task that is challenging for conventional software based on step-by-step calculations.
The networks are "grossly idealized models that are as different from real biological neural networks as apples are from planets", Hinton wrote in Nature in 2000. But they proved useful and were developed extensively. Neural networks that mimic human learning now form the basis of many advanced AI tools, from large language models (LLMs) to the machine-learning algorithms that can analyse large amounts of data, including the protein-structure-prediction model AlphaFold.
Speaking by telephone at the announcement, Hinton said that learning of his Nobel prize was "a bolt from the blue". "I'm amazed, I had no idea that it would happen," he said. He added that advances in machine learning "will have an enormous influence; it will be comparable to the industrial revolution. But instead of exceeding people in physical strength, it will exceed people in intellectual ability."
1. Hopfield, J. J. Proc. Natl Acad. Sci. USA 79, 2554 (1982).
2. Fahlman, S. E., Hinton, G. E. & Sejnowski, T. J. Proceedings of the AAAI-83 Conference, 109-113 (1983).