The Rise of In-Memory Computing Chips in China, Due to the AI Energy Crisis

Lily Polanco · Apr 04, 2024 · 1 min read

A recent article from Bloomberg highlights how the growing energy demands of artificial intelligence are boosting interest in a technology called in-memory computing. Because AI workloads process enormous volumes of data and consume vast amounts of electricity in doing so, startups and major chipmakers are exploring new “in-memory” chip designs that could make AI more energy efficient.

According to the Bloomberg article, traditional AI chips require huge amounts of data to shuttle between separate memory and processor chips. This is inefficient: it amounts to transporting “the equivalent of billions of books…back and forth,” which burns through electricity. In-memory computing aims to process data within the memory chips themselves, analogous to “going to the library to do your work” rather than transporting books elsewhere, as Philip Wong, a Stanford professor and Taiwan Semiconductor Manufacturing consultant, explained to Bloomberg.
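The intuition above can be put in rough numbers. The toy model below is a minimal sketch, not from the Bloomberg article: the per-operation energy figures are assumed ballpark values, chosen only to show that when off-chip data transfer costs far more than on-chip arithmetic, eliminating the transfer term dominates the savings.

```python
# Illustrative back-of-the-envelope energy model (all figures are assumptions,
# in picojoules; real values vary widely by process node and memory type).

DRAM_TRANSFER_PJ_PER_BYTE = 100.0   # assumed cost of moving one byte off-chip
MAC_PJ_PER_OP = 1.0                 # assumed cost of one on-chip multiply-accumulate

def conventional_energy_pj(num_ops: int, bytes_moved: int) -> float:
    """Energy when data shuttles between separate memory and processor chips."""
    return num_ops * MAC_PJ_PER_OP + bytes_moved * DRAM_TRANSFER_PJ_PER_BYTE

def in_memory_energy_pj(num_ops: int) -> float:
    """Energy when the computation happens inside the memory array itself."""
    return num_ops * MAC_PJ_PER_OP  # the off-chip transfer term disappears

# Hypothetical workload: 1 million multiply-accumulates over 4 MB of weights.
ops, data = 1_000_000, 4_000_000
print(conventional_energy_pj(ops, data) / in_memory_energy_pj(ops))  # → 401.0
```

Under these assumed numbers the conventional design spends roughly 400× more energy, almost all of it on data movement rather than arithmetic, which is the gap in-memory designs target.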

While technically challenging, in-memory computing is now moving beyond research as its energy-saving potential for AI applications gains attention. As the Bloomberg article outlines, major chipmakers like Intel, TSMC and Samsung are researching in-memory designs. Startups focused on this technology have also raised over $160 million from investors including Microsoft, Saudi Aramco’s venture arm, and Chinese investors.

However, the article notes that many challenges remain. In-memory chips must demonstrate reliable operation across temperature fluctuations, and convincing customers to adopt a new chip type requires proven improvements over incumbent designs. While not yet targeting the most complex AI training work, startups like d-Matrix and Mythic aim to enable more efficient inference using domain-specific in-memory chips.

As the article concludes, AI’s vast energy demands and potential environmental impacts are driving urgency for more efficient solutions. In-memory computing’s promise of processing data where it resides could fulfill this need. If startups can overcome technical hurdles, this technology may play a meaningful role in the future of sustainable, large-scale artificial intelligence.

Written by Lily Polanco
Junior News Writer @