The human brain may hold the key to energy-efficient AI/machine learning

According to a team of Penn State researchers, a better understanding of how astrocytes function and can be emulated in the physics of hardware devices could lead to artificial intelligence (AI) and machine learning that autonomously self-repairs and consumes far less energy than current technologies.

Astrocytes, named for their star shape, are a type of glial cell that supports neurons in the brain. They are essential for memory, learning, self-repair, and synchronization.

“This project arose from recent observations in computational neuroscience,” said Abhronil Sengupta, assistant professor of electrical engineering and computer science. “There has been a lot of effort to understand how the brain works, and people are trying to revise the simplistic model of neuron-synapse connections. It turns out that the brain has a third component, astrocytes, which make up a significant portion of the brain’s cells, but their role in machine learning and neuroscience has been overlooked.”

Simultaneously, the fields of AI and machine learning are booming. Demand for AI and machine learning skills is expected to grow at a compound annual growth rate of 71 percent through 2025, according to the analytics firm Burning Glass Technologies. However, as the use of these technologies grows, AI and machine learning face a challenge: they consume a lot of energy.

“The amount of power consumed by AI and machine learning systems is an often-underestimated issue,” Sengupta said. “IBM, for example, tried to simulate a cat’s brain activity a few years ago and ended up consuming a few megawatts of power in the process. And if we scaled this up to simulate human brain activity on the best supercomputer available today, the power consumption would be even higher.”

All of this power consumption is due to the intricate dance of switches, semiconductors, and other mechanical and electrical processes that occur during computer processing, which increases dramatically when the processes are as complex as those required by AI and machine learning. Neuromorphic computing, which simulates brain functions, is one possible solution.

Researchers are interested in neuromorphic computing because the human brain has evolved to use far less energy than a computer for its processes, so mimicking those functions would make AI and machine learning more energy-efficient. Another brain function with neuromorphic computing potential is its ability to self-repair damaged neurons and synapses.

“Astrocytes play a critical role in the brain’s self-repair,” Sengupta said. “When we build prototype neuromorphic hardware from these new device structures, it is plagued by hardware faults. So, based on how astrocyte glial cells drive self-repair in the brain, we might be able to use concepts from computational neuroscience to repair those faults in neuromorphic hardware.”
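
The article does not describe the repair mechanism itself, but the underlying idea can be illustrated with a deliberately simplified toy model (all names, sizes, and numbers below are invented for illustration): when some synapses in a layer become stuck at zero because of hardware faults, their lost contribution is redistributed across the surviving synapses, loosely analogous to astrocyte-mediated compensation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "neuromorphic" layer: 8 input synapses feeding one neuron.
weights = rng.normal(0.0, 1.0, size=8)
healthy = np.ones(8, dtype=bool)

# Simulate hardware faults: two synapses get stuck at zero.
faulty = [2, 5]
healthy[faulty] = False
lost = weights[faulty].sum()
weights[faulty] = 0.0

# Astrocyte-inspired repair (toy version): spread the lost contribution
# over the surviving synapses, so the neuron's summed response to a
# uniform input is preserved despite the faults.
weights[healthy] += lost / healthy.sum()

x = np.ones(8)  # uniform test input
print(weights @ x)  # matches the pre-fault response for this input
```

This is a sketch of the concept only; the actual self-repair studied by the Penn State group operates through the intrinsic physics of spintronic devices, not a software weight update.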

Sengupta’s lab specializes in spintronic devices, which are electronic devices that process data using the spin of electrons. The researchers look at the magnetic structures of the devices and how to make them neuromorphic by simulating various neural synaptic functions of the brain in the devices’ intrinsic physics.

That research was published in the journal Frontiers in Neuroscience in January, and a follow-up study building on it was recently published in the same journal. “We realized that astrocytes also contribute to temporal information binding when we started working on aspects of self-repair in the previous study,” Sengupta said.

Temporal information binding is the process by which the brain makes sense of relationships between separate events occurring at different times, as well as making sense of these events as a sequence, which is a key function of AI and machine learning.

“It turns out that the magnetic structures we were working with in the previous study can be synchronized together through various coupling mechanisms,” Sengupta said. “We wanted to see how these synchronized magnetic devices could mimic astrocyte-induced phase coupling, going beyond previous work on solely neuro-synaptic devices. We want the devices’ inherent physics to mimic the brain’s astrocyte phase coupling.”

To better understand how this could be accomplished, the researchers created neuroscience models, including astrocyte models, to determine which aspects of astrocyte function would be most relevant to their study. Theoretical models of potential spintronic devices were also developed.

“We needed to understand device physics, which required a lot of theoretical modeling of the devices,” Sengupta explained. “After that, we looked into how we could develop an end-to-end, cross-disciplinary modeling framework that included everything from neuroscience models to algorithms to device physics.”

Developing “astromorphic computing” that is both energy-efficient and fault-tolerant could pave the way for more sophisticated AI and machine learning work to be done on power-constrained devices like smartphones.

“Every day, AI and machine learning are revolutionizing the world around us,” Sengupta said. “You can see it in your smartphone recognizing pictures of your friends and family, and in machine learning’s huge impact on medical diagnosis for various diseases. At the same time, research into the types of self-repair and synchronization functions that astrocytes can enable in neuromorphic computing is still in its early stages. With these kinds of components, there are a lot of possibilities.”

The National Science Foundation funded this research through its Early Concept Grant for Exploratory Research program, which is designed specifically for interdisciplinary high-risk, high-payoff projects with transformative potential.