For The First Time, Google Reveals How Much Energy An AI Prompt Uses


Google has revealed that each text prompt given to its Gemini AI apps uses about 0.24 watt-hours of electricity — roughly the same as watching TV for less than nine seconds.
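For readers who want to sanity-check that comparison, here is a minimal back-of-the-envelope sketch; the roughly 100-watt power draw for a television is an illustrative assumption, not a figure from Google's report.

```python
# Back-of-the-envelope check of the "less than nine seconds of TV" comparison.
# The 100 W figure for a television is an assumed value for illustration only.
prompt_energy_wh = 0.24   # median Gemini text prompt, per Google's report
tv_power_w = 100          # assumed power draw of a typical TV

seconds_of_tv = prompt_energy_wh / tv_power_w * 3600
print(f"{seconds_of_tv:.1f} seconds of TV")  # prints ~8.6 seconds
```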
As concerns grow about AI’s environmental impact, Google released a detailed technical report in a blog post on Thursday, Aug. 21, 2025, estimating the energy use, water consumption and carbon emissions per prompt.
The report — which is the most transparent disclosure yet from a leading tech company — offers a long-awaited insight into AI’s energy use, detailing the amount of power consumed by AI chips and their supporting infrastructure.
“We wanted to be quite comprehensive in all the things we included,” Jeff Dean, Google’s chief scientist, told the MIT Technology Review.
The MIT Technology Review reports that Google’s custom AI chips account for 58% of the 0.24 watt-hours of electricity. The remaining energy goes to supporting equipment: CPUs and memory use 25%, backup systems 10%, and data center overhead, such as cooling and power conversion, adds another 8%.
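As a rough illustration of what those shares mean in absolute terms, the sketch below converts each percentage into milliwatt-hours of the 0.24 Wh total. The component labels are paraphrased from the reporting, and the published shares sum to slightly over 100% because of rounding.

```python
# Illustrative breakdown of the 0.24 Wh median prompt energy, using the
# component shares reported by MIT Technology Review (rounded figures).
prompt_energy_wh = 0.24

shares = {
    "custom AI chips": 0.58,
    "CPUs and memory": 0.25,
    "backup systems": 0.10,
    "data center overhead (cooling, power conversion)": 0.08,
}

for component, share in shares.items():
    print(f"{component}: {share * prompt_energy_wh * 1000:.0f} mWh")
# e.g. custom AI chips: ~139 mWh of the 240 mWh total
```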
Google’s report also revealed a sharp drop in energy use per Gemini text prompt. Between May 2024 and May 2025, the median prompt’s energy use fell by a factor of 33 and its carbon footprint by a factor of 44. The company credits model advancements and software optimizations for the improvement.
“People are using [AI tools] for all kinds of things, and they shouldn’t have major concerns about the energy usage or the water usage of Gemini models, because in our actual measurements, what we were able to show was that it’s actually equivalent to things you do without even thinking about it on a daily basis,” Dean said, per the MIT Technology Review, “like watching a few seconds of TV or consuming five drops of water.”
Despite its transparency, Google’s report leaves key gaps. It doesn’t reveal the total number of Gemini queries, making it hard to estimate overall energy use.
The reported figure reflects only the median energy for text prompts — some queries use much more, and tasks like generating images or videos, which weren’t included, require significantly more energy.
Still, Google says the goal was to give users a clearer view of AI’s energy impact.
“Gemini’s efficiency gains are the result of years of work, but this is just the beginning,” Google said in its blog post.
“Recognizing that AI demand is growing, we’re heavily investing in reducing the power provisioning costs and water required per prompt. By sharing our findings and methodology, we aim to drive industry-wide progress toward more efficient AI. This is essential for responsible AI development,” the post concluded.




