How Brain Research is Advancing Artificial Intelligence


Research Team Paves the Way for More Efficient AI

Artificial intelligence (AI), and especially the training of AI systems such as ChatGPT, consumes vast amounts of energy. If AI could operate more like the human brain, it would be significantly more efficient. Dr. Achim Schilling and Dr. Patrick Krauss from the Neuroscientific Laboratory of the Ear, Nose, and Throat Clinic – Head and Neck Surgery (Director: Prof. Dr. Dr. h. c. Heinrich Iro) at the University Hospital Erlangen, together with their colleagues Dr. Richard Gerum from Canada and Dr. André Erpenbeck from the USA, have developed a method for modifying certain artificial nerve cells so that they behave more like neurons in the brain. The work of the researchers from the University Hospital Erlangen and Friedrich-Alexander University Erlangen-Nuremberg (FAU) aims to support the development of AI systems that require fewer resources such as energy and computing power. Their study was named the best publication from among more than 1,800 submitted and over 1,000 accepted papers at the world’s largest conference on neural networks.

Conventional AI systems are built from units loosely inspired by the design of biological neurons. However, they compute with continuous numerical values, whereas natural nerve cells process information with binary electrical impulses known as spikes. This makes the brain’s activity far more efficient: information is encoded not in the strength of these impulses but in the intervals between them.
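To make the contrast concrete, here is a minimal sketch (purely illustrative; the weights, threshold, and names are assumptions, not taken from the study) of a conventional artificial neuron, whose output is a continuous value, next to a spiking neuron, whose output is a single binary impulse:

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.normal(size=5)    # signals arriving from upstream neurons
weights = rng.normal(size=5)   # connection strengths

# Conventional artificial neuron: information lies in the exact real value.
activation = 1.0 / (1.0 + np.exp(-inputs @ weights))  # sigmoid activation

# Spiking neuron: information lies only in whether (and when) it fires.
threshold = 0.5                            # assumed firing threshold
spike = int(inputs @ weights > threshold)  # binary output: 0 or 1

print(f"continuous activation: {activation:.4f}  spike: {spike}")
```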

AI Requires Many Times More Energy Than the Brain

Spikes are millisecond-long voltage impulses of uniform amplitude; the information lies in the time between successive spikes. AI systems, by contrast, multiply very large matrices of real numbers, with the information embedded in the exact values, i.e., the activations of the artificial neurons. This consumes vast amounts of energy. For comparison, the brain requires about 20 watts for information processing, roughly the power consumption of a light bulb, whereas even a simple graphics processor for AI applications consumes several hundred watts.
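As a rough illustration of timing-based coding, the following sketch simulates a leaky integrate-and-fire neuron (an assumed textbook model chosen for illustration; it is not the neuron model from the study). A stronger input does not produce taller spikes, only shorter intervals between them:

```python
import numpy as np

def lif_spike_times(input_current, t_max=100.0, dt=1.0,
                    tau=10.0, threshold=1.0):
    """Leaky integrate-and-fire: the membrane potential leaks with time
    constant tau, integrates the input current, and emits a spike (then
    resets) whenever it crosses the threshold."""
    v, times = 0.0, []
    for step in range(int(t_max / dt)):
        v += dt * (-v / tau + input_current)  # leak + integration
        if v >= threshold:                    # every spike looks identical...
            times.append(step * dt)           # ...only its timing differs
            v = 0.0                           # reset after firing
    return times

weak, strong = lif_spike_times(0.15), lif_spike_times(0.5)
print("weak input,   intervals between spikes:", np.diff(weak))
print("strong input, intervals between spikes:", np.diff(strong))
```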

Improving AI systems also requires substantial energy and hardware resources, because the systems are trained primarily by increasing the amount of data, such as text collections from the internet, and by continually raising the number of trainable parameters. Dr. Schilling and Dr. Krauss, who work at the intersection of AI and brain research, focused on a specific type of artificial nerve cell: LSTM (Long Short-Term Memory) units, which can “remember” previous inputs and can be made to forget unnecessary information through so-called gates. The researchers modified the LSTM units so that they behave like brain neurons, which use spikes to transmit and process information. To do so, they exploited the properties of the LSTM units to mimic the membrane potential, the voltage of biological cells over which the input signals from other neurons are summed.
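The following sketch conveys the idea in code (illustrative assumptions throughout: it uses generic LSTM-style gates with a hard threshold and reset, not the authors’ exact formulation). The cell state c plays the role of the membrane potential, summing gated input over time, and the unit’s continuous output is replaced by a binary spike:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def spiking_lstm_step(x, h, c, W, U, b, threshold=1.0):
    """One step of an LSTM-like unit whose cell state c acts as a membrane
    potential: the gates sum and forget input over time, and the continuous
    output is replaced by a binary spike with a reset."""
    z = W @ x + U @ h + b                        # gate pre-activations
    i, f, g = np.split(z, 3)                     # input gate, forget gate, candidate
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g) # leaky summation of inputs
    h = (c >= threshold).astype(c.dtype)         # fire a binary spike...
    c = c * (1.0 - h)                            # ...and reset the potential
    return h, c

# Tiny usage example with random parameters (shapes are illustrative).
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W = rng.normal(size=(3 * n_hid, n_in))
U = rng.normal(size=(3 * n_hid, n_hid))
b = np.zeros(3 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):
    x = rng.normal(size=n_in)
    h, c = spiking_lstm_step(x, h, c, W, U, b)
    print(f"t={t}: spikes={h}")
```

Because the hard threshold is non-differentiable, training such units typically relies on a differentiable surrogate for the spike function, a standard trick in spiking-network research.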

Tested on Images – Highly Promising Results

The researchers tested the modified LSTM units on four image datasets commonly used to train AI systems, in order to determine whether their units performed as well as existing artificial neurons. The result: the novel LSTM units achieved comparably good results. The next step is to apply these algorithms to more complex data such as language and music.

What makes the researchers’ work special is that it combines the strengths of AI and brain research. Their findings could pave the way for AI systems that operate more like the human brain, solving complex tasks quickly while consuming fewer resources.

About the Best Paper Award

The contribution by Schilling, Krauss, and their international collaborators was recognized with the Best Paper Award at the International Joint Conference on Neural Networks (IJCNN) 2023, where it was selected from more than 1,000 accepted papers. The conference is the world’s largest gathering on artificial neural networks and attracts a distinguished international audience; it is the most prestigious platform for the exchange of ideas among researchers from the fields of artificial neural networks, neuroinformatics, and neurotechnology.

DOI: 10.1109/IJCNN54540.2023.10191268

More information:

Dr. Achim Schilling
achim.schilling(at)uk-erlangen.de

Source: uni | mediendienst | forschung No. 44/2023
