Brain-inspired algorithms can significantly reduce AI energy use


One of the major issues facing artificial intelligence is the interplay between a computer's memory and its processor. When an algorithm runs, data flows constantly between these two components, and because AI models rely on huge amounts of data, that traffic creates a bottleneck.

A new study, published Monday in the journal Frontiers in Science by researchers from Purdue University and the Georgia Institute of Technology, proposes a new approach to building a computational architecture for artificial intelligence models using brain-inspired algorithms. Creating algorithms this way could reduce the energy costs associated with AI models, the researchers say.

"The size of language processing models has increased 5,000-fold over the past four years," Kaushik Roy, a professor of computer engineering at Purdue University and lead author of the study, said in a statement. "This alarmingly rapid expansion makes it imperative that AI be as efficient as possible. This means fundamentally rethinking how computers are designed."




Most computers today are designed around an idea dating back to 1945 called the von Neumann architecture, which separates processing and memory. That separation is where the slowdown occurs. As more people around the world use data-hungry AI models, the gap between processing speed and memory capacity could become an even bigger problem.

Researchers at IBM called out this problem in a paper earlier this year. The problem computer engineers face is called the "memory wall."

Breaking the memory wall

The memory wall refers to the disparity between memory and processing capabilities: your computer's memory struggles to keep up with its processing speeds. This isn't a new issue. A pair of researchers from the University of Virginia coined the term back in the 1990s.
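As a rough illustration of that disparity, consider a back-of-the-envelope comparison of compute throughput and memory bandwidth. The numbers below are illustrative assumptions, not figures from the study.

```python
# Illustrative sketch of the memory wall: a chip can often perform far more
# arithmetic per second than its memory can supply operands for.
# All numbers are assumptions for illustration, not from the paper.

PEAK_OPS_PER_SEC = 100e12   # assumed: 100 trillion arithmetic ops per second
MEM_BANDWIDTH = 2e12        # assumed: 2 TB/s of memory bandwidth
BYTES_PER_OPERAND = 2       # assumed: 16-bit values

# How many fresh operands the memory can deliver each second:
operands_per_sec = MEM_BANDWIDTH / BYTES_PER_OPERAND

# Unless each value fetched from memory is reused many times, the processor
# spends most of its time waiting for data rather than computing.
print(f"Compute outpaces memory delivery by {PEAK_OPS_PER_SEC / operands_per_sec:.0f}x")
```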


But now, with the spread of AI, the memory wall problem is eating up time and energy in the computers that make AI models work. In the paper, the researchers argue that we can experiment with a new computer architecture that integrates memory and processing.

Inspired by how our brains work, the artificial intelligence algorithms referred to in the paper are known as spiking neural networks. A common criticism of these algorithms in the past was that they can be slow and inaccurate, but some computer scientists say they have shown significant improvement over the past few years.
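Unlike conventional artificial neurons, which output a number on every pass, a spiking neuron stays silent until its accumulated input crosses a threshold. Here's a minimal sketch of a leaky integrate-and-fire neuron in Python, the textbook building block of such networks; the threshold and leak values are illustrative assumptions, not parameters from the paper.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the basic unit of many
# spiking neural networks. The constants are illustrative, not from the paper.

def lif_neuron(input_currents, threshold=1.0, leak=0.9):
    """Return a list of 0/1 spikes for a sequence of input currents."""
    potential = 0.0
    spikes = []
    for current in input_currents:
        # The membrane potential decays a little, then integrates new input.
        potential = leak * potential + current
        if potential >= threshold:
            spikes.append(1)    # fire a spike
            potential = 0.0     # and reset the membrane potential
        else:
            spikes.append(0)    # stay silent: no downstream work triggered
    return spikes

# A steady weak input only occasionally pushes the neuron past threshold,
# so the output is sparse:
print(lif_neuron([0.3] * 10))   # [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Because the output is sparse (most time steps produce no spike), downstream neurons have far less work to do, which is where much of the potential energy saving comes from.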

The researchers suggest that AI models built on spiking neural networks, or SNNs, should use a technique known as compute-in-memory, or CIM. The concept is still relatively new in the field of artificial intelligence.

“CIM offers a promising solution to the memory wall problem by integrating computing capabilities directly into the memory system,” the authors wrote in the paper’s abstract.
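To see why folding computation into memory helps, a toy energy model is useful. Commonly cited estimates put an off-chip memory access at orders of magnitude more energy than an arithmetic operation; the specific numbers below are illustrative assumptions, not measurements from the study.

```python
# Toy energy model of a matrix-vector multiply, the core operation in
# neural network inference. All energy figures here are illustrative
# placeholders, not measurements from the paper.

DRAM_READ_PJ = 640.0   # assumed cost to fetch one weight from off-chip memory
MAC_PJ = 1.0           # assumed cost of one multiply-accumulate operation

def von_neumann_energy(rows, cols):
    """Separate memory and processor: every weight crosses the boundary."""
    fetches = rows * cols              # one off-chip read per weight
    macs = rows * cols                 # one multiply-accumulate per weight
    return fetches * DRAM_READ_PJ + macs * MAC_PJ

def cim_energy(rows, cols):
    """Compute-in-memory: weights stay where they're stored, so the
    per-weight fetch cost largely disappears in this idealized model."""
    return rows * cols * MAC_PJ

n = 4096  # a layer size typical of large models, chosen for illustration
print(f"von Neumann: {von_neumann_energy(n, n) / 1e6:,.0f} microjoules")
print(f"CIM (ideal): {cim_energy(n, n) / 1e6:,.0f} microjoules")
```

In this toy model, the savings come almost entirely from eliminating the trips across the memory/processor boundary, which is exactly the traffic the von Neumann design forces.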

Medical devices, transportation and drones are a few of the areas where researchers believe improvements could be made if computer processing and memory were combined into a single system.

“AI is one of the most transformative technologies of the 21st century. However, to move it from data centers to the real world, we need to significantly reduce its energy use,” Tanvi Sharma, co-author and researcher at Purdue University, said in a statement.

“With less data transfer and more efficient processing, AI could fit into small, affordable devices with longer-lasting batteries,” Sharma said.


