šŸŒ Your team thinks AI is killing the planet. They're wrong.

Billions of queries, real data: AI’s true energy cost

Hi, and happy Tuesday.

Let me guess what happened in your last meeting about AI.

Someone raised their hand: “But what about the environmental impact?”

Valid concern. Responsible question.

But here’s the problem: most people are working with last year’s assumptions.

The myth we all believed

For years, the narrative was clear:

  • Every ChatGPT query chews through server farms.

  • Every image generated melts polar ice caps.

  • Headlines screamed about AI’s “insatiable” energy appetite.

Much of that was based on training costs or early inference estimates - not real production data.

What actually happens today

Google recently published the first large-scale, measured analysis of AI inference: billions of Gemini text prompts, in live production. Not models in a lab.

Here’s what they found for a median text prompt:

  • Energy: 0.24 Wh

  • Carbon: 0.03 g CO₂e

  • Water: 0.26 mL (≈ five drops)

In CEO terms: one prompt = the energy of watching TV for 9 seconds.

Extrapolated from the yearly data provided in https://arxiv.org/pdf/2508.15734
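The TV comparison is easy to sanity-check. Here’s the back-of-the-envelope arithmetic, assuming a typical TV draws about 100 W (the wattage is my assumption for illustration, not a figure from the study):

```python
# Convert the median per-prompt energy into seconds of TV viewing.
# Assumption: a typical TV draws about 100 W while on.
ENERGY_PER_PROMPT_WH = 0.24  # median Gemini text prompt (Google's study)
TV_POWER_W = 100             # assumed TV power draw

tv_seconds = ENERGY_PER_PROMPT_WH / TV_POWER_W * 3600
print(f"One prompt ≈ {tv_seconds:.1f} seconds of TV")  # ≈ 8.6 s, i.e. about 9 seconds
```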

And here’s the kicker. In the past 12 months alone:

  • Energy per query fell 33×

  • Emissions per query fell 44×

  • Water per query fell 19×

That’s not incremental. That’s exponential.
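To see what those multiples mean in absolute terms, you can back out roughly what a median query cost a year ago by applying the reported reduction factors to today’s medians (a rough sketch; it assumes the factors apply directly to the published median figures):

```python
# Back out approximate year-ago per-query figures from today's medians
# and the reported year-over-year reduction factors.
metrics = [
    ("energy (Wh)",     0.24, 33),
    ("carbon (g CO2e)", 0.03, 44),
    ("water (mL)",      0.26, 19),
]
for name, today, factor in metrics:
    year_ago = today * factor
    print(f"{name}: ~{year_ago:.2f} a year ago -> {today} today")
# energy ≈ 7.92 Wh, carbon ≈ 1.32 g, water ≈ 4.94 mL a year ago
```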

Objections your team will raise, and why they don’t stick

  1. “But that’s just text, not images or video.”
    True - image/video prompts cost more. But the same efficiency curve applies, and most enterprise AI use cases today are text-based.

  2. “Billions of queries still add up.”
    Also true. But per-query efficiency is falling faster than usage is growing. Net environmental impact is lower today than a year ago.

  3. “Google’s using market-based carbon accounting.”
    They are - but even with location-based grids, per-query numbers are still orders of magnitude below the old estimates.

  4. “What about embodied emissions (chips, data centers)?”
    Important point. But those are fixed costs. The more queries per chip, the lower the per-query share - and efficiency gains mean fewer chips are needed overall.

  5. “Isn’t this just Google PR?”
    Skepticism is healthy. But this is the first dataset based on real production inference at global scale; previous studies were projections. Given how hot a topic this is, we can expect similar studies from other players.

Reality check: Compared to what?

The danger isn’t that AI consumes 0.24 Wh per query.

It’s that we ignore the inefficiencies AI can replace. Without AI, we might visit a doctor or expert in person for the same information, absorb hours of documentaries, or grind through 47 browser tabs of manual research.

  • A single in-person meeting trip = more carbon than millions of queries.

  • One hour of TV in the average U.S. household = energy of 400 AI queries.

  • Keeping 47 browser tabs open for an hour = more energy than 100 AI queries
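The same arithmetic cross-checks these comparisons. The wattages below are illustrative assumptions on my part (a ~100 W household TV; ~25 W of extra device draw for dozens of open tabs), not figures from the source:

```python
ENERGY_PER_QUERY_WH = 0.24  # median text prompt

# One hour of TV at an assumed 100 W:
queries_per_tv_hour = 100 / ENERGY_PER_QUERY_WH
print(f"1 h of TV ≈ {queries_per_tv_hour:.0f} queries")  # ≈ 417, roughly the 400 figure

# 47 open tabs for an hour at an assumed extra ~25 W:
queries_per_tab_hour = 25 / ENERGY_PER_QUERY_WH
print(f"47 tabs for 1 h ≈ {queries_per_tab_hour:.0f} queries")  # ≈ 104, i.e. > 100
```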

But isn’t total demand still exploding?

Yes, 0.24 Wh is small, but at Google scale billions of queries could still add up to gigawatt-hours. Isn’t that still environmentally significant?

Yes, but again - compared to what previous activity? The manual, legacy processes AI replaces often waste more energy than the AI inference itself.

If it gets too cheap, people will just use more of it. That rebound effect, known as the Jevons paradox, could wipe out the sustainability gains. But three things matter here:

(1) Historical precedent: Every major technology (lighting, transport, computing) faced rebound effects. Yet efficiency gains were still the main driver of long-term decarbonization.

(2) Counterfactuals: Even with higher volumes, AI often replaces far less efficient processes (travel, legacy IT, human time). Eliminating those wastes creates a net environmental gain.

(3) Governance: Policy frameworks and corporate ESG targets act as guardrails, ensuring that efficiency improvements don’t spiral into unchecked consumption.

The question that matters

It’s not “Does AI use energy?” (Everything does.)
It’s not even “Is AI getting more efficient?” (It is - by orders of magnitude.)

The real question:

What inefficiencies can AI eliminate that waste 100×, 1,000×, even 100,000× more energy than AI ever consumes?

Your team’s environmental concerns are admirable. But channel them toward the right enemy: trillions of watt-hours lost to outdated systems AI can replace.

Reducing inefficiencies is exactly what we’ll dive into in a few complimentary AI Masterclasses I’m considering hosting, based on my book, Do More With Less: The AI Playbook for Amplifying Talent & Output. These sessions are designed to help teams spot their first high-impact use cases and deliver quick wins with the tools they already have (e.g., Copilot and Excel).

If you’d like to join, or know someone who should, here’s the form to let us know: https://docs.google.com/forms/d/e/1FAIpQLSf8ZbGGY1ZdaiUxxCOTTS3CIFmkO9f-oSzU0cfYLjO_cQn7dw/viewform 

Next time someone brings up AI’s footprint in a meeting, ask: “Compared to what?”

Best,

Dino

CEO, PreScouter & Auxee

Helping enterprise teams turn AI into shipped outcomes

Catch up on our recent videos:

Dino Gane-Palmer shares three ways AI is reshaping work: stress-testing ideas, cleaning contracts, and scaling into agent fleets that hint at zero-person firms. With insights from Conagra’s Stacey Popham and Kellogg’s Prof Mohan Sawhney, this video blends future vision with practical use cases.

10min 16sec

Well-crafted inputs and careful refinement of the outputs turn writing into gold, in a fraction of the time it takes to write from scratch.

4min 52sec

Even Greg Brockman, OpenAI President and co-founder, is shocked that ChatGPT actually works. Why? Because ChatGPT is essentially an advanced form of autocomplete.

5min 44sec