
On Tuesday, the Royal Swedish Academy of Sciences awarded the 2024 Nobel Prize in Physics to Professor John Hopfield of Princeton University (USA) and Professor Geoffrey Hinton of the University of Toronto (Canada) for foundational discoveries that enable machine learning with artificial neural networks. According to Dr Stepas Toliautas, an associate professor at the Institute of Chemical Physics at Vilnius University (VU), the laureates applied principles from physics to artificial neural networks, popularising their use for analysing complex data such as images; thanks to these discoveries, neural networks are now an essential part of artificial intelligence systems.

“Artificial neural networks are mathematical algorithms inspired by biology, specifically the structure of animal nervous systems and brains; they can be imagined as a collection of many interconnected neurons. For the network to perform its assigned task correctly, it needs to be trained – to find the most suitable network structure, i.e., the number of neurons, their connections, and the strength of these connections. The work of Professor J. Hopfield and Professor G. Hinton allowed for the effective optimisation of neural network structures, which is why they were awarded this year’s Nobel Prize in Physics,” explains Dr S. Toliautas.

“Professor J. Hopfield proposed a neural network structure whose nodes are interconnected so that the network operates like human associative memory, i.e., it recognises a desired object from partial information. Moreover, he borrowed the concept of energy, ubiquitous in physics, to evaluate a given set of network parameters: the better the network performs its assigned task, the lower its energy. Hopfield networks allow, for example, a partially damaged image to be reconstructed by restoring the missing details,” the expert recounts.
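The idea can be illustrated with a minimal sketch of a classical Hopfield network. All names, the Hebbian training rule shown, and the 8-neuron toy pattern below are illustrative choices of ours, not the laureate's code:

```python
import numpy as np

def train(patterns):
    """Store +1/-1 patterns via the Hebbian outer-product rule (zero diagonal)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def energy(W, s):
    """The network's energy; each recall update can only lower (or keep) it."""
    return -0.5 * s @ W @ s

def recall(W, s, steps=10):
    """Asynchronous updates: set each neuron to the sign of its weighted input."""
    s = s.copy()
    for _ in range(steps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store one 8-neuron pattern, then recover it from a corrupted copy.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1                            # damage two "pixels"
restored = recall(W, noisy)
print(np.array_equal(restored, pattern))   # prints True
```

The corrupted state has higher energy than the stored pattern, and the update rule slides the network downhill until the stored memory is recovered, which is exactly the associative-recall behaviour described above.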

“The other Nobel Prize laureate, Professor G. Hinton, worked with Boltzmann machine-type neural networks and likewise evaluated their energy. In this case, the best network parameters are found by mimicking the annealing process known from materials science and statistical physics: at high temperatures the network structure can change quite freely, and as the temperature decreases, the parameters – and the energy minimum – settle according to the values of the provided training data. The advantage of this principle is that it finds a well-functioning neural network with many parameters without searching for the optimal value of each individual element,” emphasises Dr S. Toliautas.
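The annealing principle can be sketched on its own, without a full Boltzmann machine. The toy Python example below minimises a bumpy one-dimensional energy landscape; the `anneal` function, the cooling schedule, and the landscape itself are illustrative assumptions of ours, not Hinton's actual training procedure:

```python
import math
import random

def anneal(energy, state, neighbour, t_start=5.0, t_end=0.01, steps=20000, seed=0):
    """Simulated annealing: explore freely while hot, settle as the system cools."""
    rng = random.Random(seed)
    best = state
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)   # geometric cooling
        candidate = neighbour(state, rng)
        dE = energy(candidate) - energy(state)
        # Accept uphill moves with Boltzmann probability exp(-dE / t),
        # so the search can escape local minima at high temperature.
        if dE <= 0 or rng.random() < math.exp(-dE / t):
            state = candidate
        if energy(state) < energy(best):
            best = state
    return best

# Toy energy landscape with many local minima (global minimum at x = 0).
def E(x):
    return x * x + 10 * (1 - math.cos(2 * math.pi * x))

x0 = 4.3
x_min = anneal(E, x0, lambda x, rng: x + rng.uniform(-0.5, 0.5))
print(round(x_min, 2))
```

Note that no coordinate is optimised in isolation: random proposals plus the temperature schedule do the work, which mirrors the advantage Dr Toliautas describes for networks with many parameters.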

According to the VU scientist, Professor J. Hopfield has continued to refine the networks named after him: his research from 2016 to 2020 underpins the so-called modern Hopfield networks, which can store large amounts of information. Although Boltzmann machines are rarely used directly today, Professor G. Hinton’s work is significant in that it popularised energy-based models for the structure and training of neural networks. Networks based on this principle can also generate new information, such as creating new images from their descriptions.

“Modern large-scale neural networks are used everywhere: from chatbots like ‘ChatGPT’ to specific tools for analysing or recognising images. For instance, in medicine, neural networks assist in searching for significant patterns in magnetic resonance imaging,” states Dr S. Toliautas.

As the VU scientist asserts, automatic solutions for data analysis tasks would look very different or might not be possible at all if, around 40 years ago, Professor J. Hopfield and Professor G. Hinton had not found ways to optimise the coefficients of artificial neural networks.

The laureates will each receive a gold medal and a diploma and share a cash prize of 11 million Swedish krona (nearly 952,000 euros).
