AI's Carbon Footprint:
The True Cost to the Planet
(And Why It's Getting Better)

October 8, 2025

By Nicholas Johnson, Founder of Ataviz Consulting

When I recently gave a presentation on artificial intelligence (AI), I was asked a great question that I hadn’t fully prepared for:

What about the environmental impact of AI? Is using AI hurting our planet?

It’s a fair and important question. After digging into it, I want to share what I learned. The story isn’t just one of cost; there’s promise, too. AI is becoming more efficient, and many signs point toward those gains soon outweighing the downsides.

Two Sides of the Coin: Training vs. Inference

To understand AI’s environmental impact well, it helps to distinguish between two phases:

  • Training: This is when a model is being built. Big datasets, a lot of computation, many high-powered machines running for hours, days, or even weeks.
  • Inference (or everyday usage): The phase when users ask questions, generate images or text, run predictions, etc. It’s the “on-demand” usage of the model after it’s been trained.

Here’s how they compare:

  • Energy & resource spike. Training: very high, but usually a one-time cost per model version. Inference: lower per interaction, but cumulative over many interactions.
  • Frequency. Training: less frequent (though models may be retrained as they improve). Inference: very frequent, with millions to billions of requests daily.
  • Environmental impact “shape”. Training: a big upfront cost, where cooling and hardware efficiency matter most. Inference: more spread out, depending heavily on infrastructure, efficiency, and the energy mix.

Examples with data:

  • Training large models like GPT-3 required huge amounts of electricity, with reported estimates of over 1,200 megawatt-hours for some comparable LLMs. (ADaSci)
  • But inference, over time, can actually account for the majority of energy usage across a model’s lifetime if it's used heavily. Studies show that in many deployments, inference’s cumulative footprint surpasses training’s. (arXiv)

So: yes, training is expensive. But when you use a model many times (or millions of times), inference adds up, often more than people realize.
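A rough back-of-envelope calculation makes this concrete. Every number below is an illustrative assumption: the 1,200 MWh training figure echoes the estimate above, while the per-query energy and daily traffic volume are hypothetical, not measured values.

```python
# Back-of-envelope: when does cumulative inference energy pass training energy?
# All figures are illustrative assumptions, not measured values.

TRAINING_MWH = 1200          # one-time training cost (reported estimates for some LLMs)
WH_PER_QUERY = 0.3           # assumed energy per inference request, in watt-hours

training_wh = TRAINING_MWH * 1_000_000   # convert MWh to Wh

# Number of queries at which cumulative inference energy equals training energy
breakeven_queries = training_wh / WH_PER_QUERY
print(f"Break-even at {breakeven_queries:,.0f} queries")

# At a hypothetical 100 million queries per day, that point arrives quickly:
queries_per_day = 100_000_000
print(f"...reached in {breakeven_queries / queries_per_day:.0f} days")
```

Under these assumed numbers, inference overtakes training in a matter of weeks, which is why lifetime footprint studies focus so heavily on usage.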

Is Using AI Hurting the Environment? And Is It Getting Better?

It would be easy to frame this as “using AI = bad for the planet.” But that’s not the full truth. Several trends show that the environmental cost is being addressed more aggressively, and usage efficiency is improving.

Here are things improving and reasons for optimism:

  1. Renewable Energy & Greener Data Centers
    More data centers are being built to run on clean energy.
    For example:
    • Meta has signed agreements to source hundreds of megawatts of solar and wind power to support its data centers.
    • Verne, in the Nordic countries, operates data centers powered by 100% renewable energy (geothermal, hydro, wind, etc.), with designs that leverage the cold climate for natural cooling, reducing the energy needed for air conditioning.
    • Partnerships like ENGIE + Prometheus are building “AI-ready” data centers in Texas that pair renewable energy and battery storage with efficient cooling designs.
  2. Efficiency Improvements in Training & Inference
    Technology and methods are evolving to reduce energy costs:
    • Research shows that techniques such as sparse training, adaptive inference, and better hardware utilization (e.g., improved GPU/TPU design and scheduling) can cut energy usage significantly. One paper reported a ~35% reduction in training energy via sparse training, and ~20-25% in inference when using adaptive methods. (IJCA Online)
    • Data-centric approaches (better-curated datasets, feature pruning, reduced unnecessary complexity) can also drastically reduce training energy, sometimes with minimal loss of accuracy. One study found energy savings of up to 92% from modifying datasets alone. (arXiv)
  3. Scale & Spread of Usage Makes Per-Interaction Cost Lower
    As models are used more, the “cost per query” tends to fall, assuming the infrastructure is efficient. Improvements in hardware, software optimizations, caching, and model distillation (making smaller versions of big models) all help.

  4. Regulatory & Market Pressure
    Regulators, customers, and investors increasingly look at environmental impact. Companies are publishing sustainability goals and energy transparency reports, and committing to “net zero.” This creates positive feedback: providers need to make their infrastructure more efficient to stay credible.
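One reason the efficiency gains in item 2 matter so much is that independent savings compound multiplicatively. The sketch below is purely illustrative; the baseline per-query energy and the individual reduction factors are assumptions loosely inspired by the ranges cited above, not measurements.

```python
# Illustrative only: how independent efficiency gains compound multiplicatively.
# The baseline and factors below are hypothetical assumptions.

baseline_wh_per_query = 0.3      # assumed starting energy per query (Wh)

improvements = {
    "distilled model (smaller weights)": 0.50,  # keeps 50% of original energy
    "adaptive inference":                0.78,  # roughly a 22% saving
    "better data-center PUE":            0.85,  # cooling/overhead reduction
}

energy = baseline_wh_per_query
for name, factor in improvements.items():
    energy *= factor

reduction = 1 - energy / baseline_wh_per_query
print(f"{baseline_wh_per_query} Wh -> {energy:.3f} Wh per query "
      f"({reduction:.0%} total reduction)")
```

Even modest individual gains, stacked this way, can cut the assumed per-query energy by well over half.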

What the Future Looks Like Environmentally

When you project forward, several things seem likely, and many are already in early stages:

  • Smaller, More Efficient Models: Not everything needs a super-large model. For many use cases, smaller or distilled models will be “good enough” and far cheaper energy-wise.
  • Inference Optimized for Efficiency: Model architectures and deployment strategies that trade off very small accuracy gains for large savings in energy will become more common. People (and businesses) will prefer models with “good enough” output but much lower environmental cost.
  • Carbon / Energy Aware AI: Systems that schedule heavy compute when renewable energy is abundant, or route tasks to data centers in regions with cleaner grids. Also, data centers will get smarter about cooling, idle power use, etc.
  • More Renewable Infrastructure: More sites located near renewable sources, more battery storage, more cooling solutions that reduce water use, better PUE (Power Usage Effectiveness) in data centers.
  • Standards & Transparency: Benchmarks or certifications for “eco-AI”, measurement tools for inference energy, disclosure of carbon footprints per model and per provider. Studies like How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference point toward the need for models to be compared on not just accuracy and speed, but also energy and environmental cost. (arXiv)
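The “carbon / energy aware” idea above can be sketched in a few lines: given an hourly forecast of grid carbon intensity, schedule a deferrable batch job into the cleanest contiguous window. The forecast values here are hypothetical, not real grid data.

```python
# Minimal sketch of carbon-aware scheduling: pick the contiguous window of
# hours with the lowest average grid carbon intensity (gCO2/kWh).

def cleanest_window(forecast, hours_needed):
    """Return (start_hour, average_intensity) of the lowest-carbon window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - hours_needed + 1):
        avg = sum(forecast[start:start + hours_needed]) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Hypothetical 12-hour carbon-intensity forecast (gCO2/kWh):
forecast = [420, 390, 310, 180, 120, 110, 150, 260, 350, 400, 430, 440]
start, avg = cleanest_window(forecast, hours_needed=3)
print(f"Run the 3-hour job starting at hour {start} (avg {avg:.0f} gCO2/kWh)")
```

Real systems use live grid data (services such as Electricity Maps or WattTime publish it), but the core decision is this simple: defer flexible compute to when the grid is cleanest.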

What It Means for the Everyday AI User

You don’t need to be building LLMs to make choices that help tilt the scale toward better environmental outcomes. Here’s what novice or light users (folks using AI tools for writing, image generation, content, and so on) can take away:

  • Don’t Sweat Every Prompt, But Be Mindful
    One query or image isn’t going to wreck the planet. What matters is aggregate usage. If you use AI often, being a bit more intentional helps.
  • Prefer Tools / Providers with Clear Sustainability Reports
    When choosing an AI or cloud provider, see if they publish environmental metrics, use renewables, or have green infrastructure. That signals that your usage will be “greener” by default.
  • Use AI Features Wisely
    For example, use shorter prompts, reuse templates, avoid requesting long outputs you don’t need, and batch tasks when possible. These small optimizations can reduce inference costs.
  • Support / Adopt Efficient Models
    If there are “light version” models, offline models, or “distilled” versions, consider using those when possible. Sometimes they do “enough” with much less cost.
  • Encourage Transparency and Accountability
    As users, asking for carbon estimates, or how tools are powered, helps drive demand. If companies see people care, they’ll invest more.

Bottom Line

  • Yes — there is environmental impact in both training and using AI. Training is energy-intensive; inference, because it is continuous, adds up over time.
  • But here’s the hopeful part: the trajectory is positive. Advances in infrastructure, renewable energy, model design, and efficiency methods are reducing the marginal environmental cost of using AI.
  • We’re approaching (or will approach) a tipping point where the efficiency and value AI brings (faster decisions, better tools, and optimizations across many industries) will outweigh the remaining environmental costs, especially as providers and users grow more environmentally aware.

So the story isn’t “AI bad, stop using it.”

The story is “AI is evolving, its environmental footprint is shrinking, and that gives us a responsibility (and an opportunity) to use it well, choose providers consciously, and push for more sustainable models.”

-- Your Hidden CTO

