“Companies that adopt an energy-first approach for AI are the future.” – Vasudha Badri Paul, CEO, Avatara AI
The times are changing, and in the age of AI those changes are often unfolding at a speed we never previously imagined. As our contemporary lives become increasingly intertwined with artificial intelligence—from search engines to business tools—a new concern is beginning to emerge: how AI's rapidly growing energy footprint will affect us.
A new report from data infrastructure provider TRG Datacenters offers some insights into this challenge. Encouragingly, it also suggests that some of the leading AI developers are beginning to tackle the problem by significantly improving the energy efficiency of their models.
TRG Datacenters CEO Chris Hinkle says, “The math is simple but scary: AI demand is on track to quadruple by 2030, and our power grids just aren’t built for that speed. We’re hitting a physical wall where we can’t just build more data centers; we have to make the software stop being so ‘hungry.’”
The study examined major language models to see how companies are saving energy as the technology continues to grow. Its findings reveal a clear trend: the newest generation of AI models is becoming markedly more efficient, even as usage continues to surge. Many AI experts agree that improving the energy efficiency of AI systems is just as critical as expanding their capabilities, particularly as global demand grows exponentially.
The study found that Grok 4.1 may be leading the efficiency gains. The model reduced energy consumption by 38 percent compared to its previous version. Despite handling 134 million daily queries, Grok 4.1 lowered its power requirement from 0.55 watt-hours per query to 0.34 watt-hours. This also brought the average cost down from $0.000098 to $0.000061 per request—the largest improvement recorded in the study. Researchers described it as “the most energy-efficient model in the world today.”
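The reported percentage follows directly from the per-query figures quoted above, as a quick sanity check shows (using only numbers from the study):

```python
# Sanity check of the Grok 4.1 figures reported in the study.
energy_before_wh = 0.55   # watt-hours per query, previous version
energy_after_wh = 0.34    # watt-hours per query, Grok 4.1
cost_before = 0.000098    # dollars per request, previous version
cost_after = 0.000061     # dollars per request, Grok 4.1

energy_drop = (energy_before_wh - energy_after_wh) / energy_before_wh
cost_drop = (cost_before - cost_after) / cost_before

print(f"Energy reduction: {energy_drop:.0%}")  # 38%, matching the report
print(f"Cost reduction: {cost_drop:.0%}")      # 38%
```

Both the energy and the cost figures round to the same 38 percent reduction, which is consistent with cost per request scaling with energy per query.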
These findings reflect a broader push in the technology industry toward what experts are calling Green AI, an approach focused on reducing the environmental impact of large-scale artificial intelligence systems.
Sridhar Verose, a council member in the city of San Ramon and a technologist with more than two decades of experience in cloud operations and digital transformation, said the shift is becoming increasingly necessary.
“Green AI is driven by the need to reduce the rapidly growing energy demands of large-scale AI models. A multi-layered approach combines energy-efficient hardware, algorithmic efficiency, and specialized, smaller model architectures,” he said.
The research also shows that Google’s Gemini 3 follows closely behind. Ranked second in the study, Gemini 3 reduced energy consumption by 35 percent.
According to the report, “The model supports an estimated 850 million daily queries, yet maintains the lowest cost per request in the ranking—just $0.000043.” By reducing its power use by more than a third, Gemini 3 demonstrates that large-scale AI systems can expand rapidly while still keeping operating costs and electricity demand under control.
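At that scale, even a fraction of a cent per request adds up. A back-of-the-envelope calculation from the two Gemini 3 figures quoted above (850 million daily queries at $0.000043 per request) gives a sense of the daily serving bill:

```python
# Back-of-the-envelope daily serving cost for Gemini 3,
# using only the figures quoted from the report.
daily_queries = 850_000_000   # estimated daily queries
cost_per_request = 0.000043   # dollars per request

daily_cost = daily_queries * cost_per_request
print(f"Estimated daily serving cost: ${daily_cost:,.0f}")  # $36,550
```

Roughly $36,550 per day to serve 850 million queries illustrates why shaving a third off per-query power translates directly into operating-cost headroom.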
Several other leading AI systems also posted meaningful improvements. Claude Opus 4.5 from Anthropic reduced electricity use by 27 percent while processing around 180 million daily queries. Meanwhile, China-developed DeepSeek V3.2 improved efficiency by 25 percent while handling roughly 650 million daily queries.
The push for energy-efficient AI is becoming increasingly urgent as global demand continues to grow. Data centers already account for a rising share of electricity consumption, and the explosive growth of generative AI tools is expected to accelerate that trend.
Vasudha Badri Paul, CEO of Avatara AI, a San Francisco-based firm that helps businesses design and manage AI solutions, says the future of AI must align with climate considerations.
“The need is to align computing with the future of climate by using stranded, wasted energy to power AI workloads. Companies that adopt an energy-first approach for AI are the future,” she adds.
If the research is any indication, the coming years could bring even more energy-efficient models. Efficiency gains of 30 percent or more from models such as Grok and Gemini already signal meaningful progress.
Hinkle also notes that the shift toward efficiency could be critical to sustaining the rapid growth of AI. “Seeing models like Grok or Gemini slash their energy use by 30% or more proves that we can actually make these systems smarter without just throwing more juice at them,” he says.
“When you look at GPT-5.2, saving 19% across 2.5 billion daily hits is the equivalent of powering a whole city for free. This kind of ‘efficiency-first’ mindset is the only way we keep the lights on while the AI boom continues.”
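To put Hinkle's GPT-5.2 claim in rough perspective: the report gives no per-query energy figure for GPT-5.2, so the baseline below is purely an illustrative assumption (0.5 watt-hours per query, in line with the pre-improvement Grok figure cited earlier), not a number from the study.

```python
# Illustrative only: assumed_baseline_wh is a hypothetical value,
# not a figure from the TRG Datacenters report.
assumed_baseline_wh = 0.5        # hypothetical watt-hours per query
daily_queries = 2_500_000_000    # "2.5 billion daily hits" (from the quote)
savings_fraction = 0.19          # 19% reduction (from the quote)

saved_wh_per_day = assumed_baseline_wh * daily_queries * savings_fraction
saved_mwh_per_day = saved_wh_per_day / 1e6
print(f"Energy saved: ~{saved_mwh_per_day:.0f} MWh per day")
```

Under that assumption the saving comes to a few hundred megawatt-hours per day — on the order of the daily usage of several thousand US homes, which is the scale Hinkle's "powering a whole city" framing gestures at.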


