AI Uses Less Energy by Thinking More Like the Human Brain


Chen Mei

AI Strategy Consultant

November 3, 2025

Date: March 27, 2025
Source: Texas A&M University

Artificial intelligence is fast, powerful, and capable of solving problems humans never could on their own. But that performance comes at a steep price: huge amounts of electricity. The human brain, by contrast, is itself a remarkably capable computer, yet it runs on roughly the energy of a small light bulb. Now, engineers at Texas A&M University say they may have found a way to close that gap. Their work on a new architecture called “Super-Turing AI” mimics the brain’s ability to combine memory and learning into one process, instead of shuffling massive amounts of data back and forth like traditional AI. The team believes this could reshape the way future AI systems are built.

The Energy Problem in AI

From large language models like ChatGPT to other advanced systems, today’s AI requires immense computing power. These models live in sprawling data centers that consume electricity at staggering scales. “Data centers are consuming power in gigawatts, whereas our brain consumes 20 watts,” explained Dr. Suin Yi, assistant professor of electrical and computer engineering at Texas A&M. “That’s one billion watts compared to just 20.” This growing demand doesn’t just raise the price of operating AI; it also increases its carbon footprint. As AI expands into more industries and products, finding ways to make it sustainable is becoming critical.
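To make the scale of Yi’s comparison concrete, the arithmetic can be spelled out directly. The two figures (one gigawatt for a data center, 20 watts for the brain) are the article’s; the snippet below only computes the ratio between them.

```python
# Figures quoted in the article: gigawatt-scale data centers vs. a ~20 W brain.
data_center_watts = 1e9   # 1 gigawatt, i.e. one billion watts
brain_watts = 20.0

ratio = data_center_watts / brain_watts
print(f"A 1 GW data center draws {ratio:,.0f} times the power of a brain")
# -> A 1 GW data center draws 50,000,000 times the power of a brain
```

In other words, the gap Yi describes is a factor of fifty million.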

Inspired by the Brain

The answer, Yi and his colleagues argue, lies in biology. In the human brain, learning and memory are not separate tasks. They are tightly connected through synapses — the junctions where neurons communicate. These connections change through synaptic plasticity, allowing the brain to adapt, form new pathways, and recall information. In contrast, traditional computer hardware separates training (how an AI model learns) from memory (where data is stored). This design forces constant data migration, which is both time-consuming and energy-intensive. Super-Turing AI tackles this problem by merging learning and memory in one system. Instead of depending only on backpropagation, the standard but energy-hungry training method, the researchers tested biologically inspired techniques such as Hebbian learning (“cells that fire together, wire together”) and spike-timing-dependent plasticity. These processes work much like the brain, letting neurons strengthen or weaken connections based on activity.
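The Hebbian rule mentioned above can be illustrated in a few lines. This is a minimal textbook sketch, not the researchers’ hardware implementation: each weight grows in proportion to the joint activity of the neurons it connects, using only locally available signals rather than a global backpropagation pass. The function name, learning rate, and toy activity vectors are illustrative choices.

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.1):
    """Hebbian rule: strengthen weights where pre- and post-synaptic
    activity coincide ("cells that fire together, wire together")."""
    return w + lr * np.outer(post, pre)

pre = np.array([1.0, 0.0, 1.0])   # activity of 3 pre-synaptic neurons
post = np.array([1.0, 0.0])       # activity of 2 post-synaptic neurons
w = np.zeros((2, 3))              # synaptic weights, initially zero

w = hebbian_update(w, pre, post)
# Only connections between co-active neurons changed:
# w[0] = [0.1, 0.0, 0.1], w[1] = [0.0, 0.0, 0.0]
```

Spike-timing-dependent plasticity refines this idea further by making the sign and size of each update depend on the relative timing of spikes, but the key property is the same: updates are local, so no energy is spent shuttling data between separate memory and compute units.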

Testing the Idea

The researchers demonstrated the concept with a circuit that helped a drone navigate a challenging environment. Without prior training, the drone adapted in real time, learning on the go. In the experiment, the system was faster and more adaptive than conventional AI while consuming less energy.

Why It’s Important

Tech companies are racing to make AI models bigger and smarter. But scaling up comes with serious limits — both hardware capacity and energy usage. In some cases, the only option is to build entirely new data centers, which adds more cost and more environmental impact. Yi stresses that hardware innovation must keep pace with AI’s rapid software growth. “Many people think AI is just software, but without computing hardware, AI cannot exist,” he said.

The Future: Smarter, Greener AI

Super-Turing AI could be a turning point for the industry. By designing systems that work more like the human brain, researchers hope to make AI both powerful and sustainable. “Modern AI like ChatGPT is incredible, but it’s too expensive,” Yi said. “We want to create sustainable AI. Super-Turing AI could completely change how AI is designed and deployed, making sure progress benefits both people and the planet.”

Journal Reference

Jungmin Lee, Rahul Shenoy, Atharva Deo, Suin Yi, Dawei Gao, David Qiao, Mingjie Xu, Shiva Asapu, Zixuan Rong, Dhruva Nathan, Yong Hei, Dharma Paladugu, Jian-Guo Zheng, J. Joshua Yang, R. Stanley Williams, Qing Wu, Yong Chen. HfZrO-based synaptic resistor circuit for a Super-Turing intelligent system. Science Advances, 2025; 11 (9). DOI: 10.1126/sciadv.adr2082
