Breakthrough Technology Reduces AI Processing Energy Requirements by 1000 Times or More
Researchers at the University of Minnesota Twin Cities have demonstrated a technology that could fundamentally change how artificial intelligence (AI) computing is done. In a recent study published in the peer-reviewed journal npj Unconventional Computing, the team unveiled a hardware approach that could cut the energy consumed by AI processing by a factor of at least a thousand.
Conventional AI computing constantly shuttles data between processing units and memory or storage, and this data movement accounts for much of the energy drain. The researchers instead developed Computational Random-Access Memory (CRAM), which embeds a high-density, reconfigurable spintronic compute substrate directly within the memory cells. With this approach, processing occurs entirely inside the memory array, eliminating the need for data to travel back and forth.
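To see why eliminating data movement matters so much, consider a toy energy model. The per-event energy figures below are illustrative assumptions chosen for the sketch, not measurements from the CRAM study; the point is only that when moving an operand costs far more than operating on it, in-memory computation wins by a large factor.

```python
# Toy energy model contrasting a conventional (von Neumann) pipeline,
# where every operand is shuttled between memory and the processor,
# with an in-memory scheme where logic executes inside the array.
# All per-event energies are illustrative assumptions (picojoules).

DRAM_TRANSFER_PJ = 100.0   # assumed cost to move one operand to the CPU
COMPUTE_PJ = 1.0           # assumed cost of one arithmetic operation
IN_MEMORY_OP_PJ = 1.5      # assumed cost of the same op done in-array

def von_neumann_energy(n_ops: int) -> float:
    """Each op fetches two operands and writes one result back."""
    return n_ops * (3 * DRAM_TRANSFER_PJ + COMPUTE_PJ)

def in_memory_energy(n_ops: int) -> float:
    """Operands never leave the array; only the op itself costs energy."""
    return n_ops * IN_MEMORY_OP_PJ

ops = 1_000_000
ratio = von_neumann_energy(ops) / in_memory_energy(ops)
print(f"Energy ratio (conventional / in-memory): {ratio:.0f}x")
```

Even with these rough numbers, the conventional pipeline spends two orders of magnitude more energy, nearly all of it on moving data rather than computing.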
In tests comparing CRAM to existing near-memory processing systems, the results were striking: CRAM proved to be 2,500 times more energy-efficient and 1,700 times faster on tasks such as training AI systems to recognize handwriting. Efficiency gains of this magnitude could have a profound impact on the energy footprint of AI workloads, which, by some forecasts, could soon rival the electricity consumption of entire nations.
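The two reported factors can be folded into a single figure of merit. Treating their product as an energy-delay product (EDP) gain is this sketch's own simplification, not a claim from the article, but it illustrates how energy and speed improvements compound:

```python
# Combining the two reported factors into an energy-delay product (EDP)
# figure of merit. The 2,500x and 1,700x values are from the article;
# multiplying them as an EDP gain is this sketch's simplifying assumption.

energy_gain = 2500   # reported energy-efficiency improvement
speed_gain = 1700    # reported speedup on the handwriting task

edp_gain = energy_gain * speed_gain
print(f"Approximate energy-delay product improvement: {edp_gain:,}x")
# prints 4,250,000x under this simplification
```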
The lead author of the study, Yang Lv, and the rest of the research team have already applied for patents based on their technology and plan to collaborate with semiconductor industry leaders to further develop and scale up their innovation. With the potential to greatly reduce energy consumption and improve processing speeds, this new technology could pave the way for more efficient and sustainable AI computing in the future.