AI Governance · 8 min read · By James Okafor

Quick Answer

The environmental footprint of enterprise AI — energy consumption, carbon emissions, water usage — and practical strategies for building sustainable AI programs.

AI and Sustainability: Reducing Environmental Impact

Enterprise AI programs are under increasing scrutiny for their environmental impact. A single training run for a large language model can consume as much energy as hundreds of transatlantic flights. Ongoing inference at scale consumes significant electricity and water for cooling. As ESG commitments become more rigorous, AI sustainability cannot be an afterthought.


The AI Environmental Footprint

Training vs Inference

Training: The one-time process of building an AI model. Extremely energy-intensive: training GPT-3 was estimated at roughly 1,300 megawatt-hours of electricity, and later frontier models such as GPT-4 are believed to have required substantially more. Training large models produces hundreds of tons of CO2.

Inference: Running the trained model for each query. Much less energy per query than training, but queries happen billions of times — cumulative inference energy can exceed training energy over a model's lifetime.

Most enterprise organizations don't train foundation models — they use models trained by AI providers. The training footprint is the provider's. The inference footprint is shared.


Water Consumption

Data centers use enormous amounts of water for cooling. One widely cited estimate puts a session of large-model queries at roughly 500 milliliters of water. At scale this is significant, and particularly relevant for data centers in water-stressed regions.


The Efficiency Trajectory

The good news: for now, AI efficiency is improving faster than usage is growing. Each generation of AI hardware delivers more compute per watt, and model optimization techniques (quantization, distillation, sparse models) reduce energy per query. This means the energy intensity of AI is falling even as absolute usage grows.


Assessing Your AI Carbon Footprint

For enterprise AI programs, the relevant footprint includes:

Scope 1+2 (Direct + Electricity):

  • On-premise GPU infrastructure energy consumption
  • Cloud AI inference energy consumption (allocated from cloud provider)

Scope 3 (Value chain):

  • Hardware manufacturing emissions for AI compute
  • Emissions from AI providers' operations (through your contract)

Most enterprises rely primarily on cloud AI inference. Cloud providers increasingly offer carbon impact reporting for AI workloads.
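The scope breakdown above can be tracked as a simple emissions inventory. The sketch below is illustrative: the line items, figures, and category labels are assumptions, not a reporting standard's required format.

```python
from dataclasses import dataclass

# Hypothetical line items; categories follow the Scope 1+2 / Scope 3
# split described above. All figures are made-up examples.
@dataclass
class EmissionLine:
    source: str       # e.g. "on-prem GPUs", "cloud inference"
    scope: str        # "1+2" or "3"
    kg_co2e: float    # CO2-equivalent in kilograms

def footprint_by_scope(lines):
    """Sum CO2e per scope category."""
    totals = {}
    for line in lines:
        totals[line.scope] = totals.get(line.scope, 0.0) + line.kg_co2e
    return totals

inventory = [
    EmissionLine("on-prem GPU cluster", "1+2", 12_000.0),
    EmissionLine("cloud AI inference", "1+2", 4_500.0),
    EmissionLine("GPU hardware manufacturing", "3", 8_000.0),
    EmissionLine("AI provider operations", "3", 2_200.0),
]

print(footprint_by_scope(inventory))  # {'1+2': 16500.0, '3': 10200.0}
```

In practice, the cloud-inference figure would come from the provider's carbon reporting rather than an internal estimate.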


Strategies for Sustainable AI

Strategy 1: Choose AI providers with strong renewable energy commitments

The single highest-impact decision for enterprise AI sustainability. Cloud providers with 100% renewable energy commitments (or high renewable content) dramatically reduce the Scope 2 footprint of inference.

  • Google Cloud: Carbon-neutral since 2007; targeting 24/7 carbon-free energy by 2030
  • Microsoft Azure: Carbon-neutral; targeting carbon-negative by 2030
  • AWS: Committed to 100% renewable energy by 2025 (now achieved for most regions)

Select AI providers and cloud regions with the highest renewable energy content.


Strategy 2: Right-size models for tasks

Using a 175-billion-parameter model for a task that a 7-billion-parameter model handles equally well can consume roughly 25x more energy per query than necessary.

Implement model routing: Classify request complexity and route to the smallest capable model. This reduces energy per query while maintaining quality.

Use efficient models: Many tasks (classification, extraction, summarization) can be handled by smaller, highly efficient models. Reserve large models for genuinely complex reasoning.
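Model routing can be sketched in a few lines. The model names and the complexity heuristic below are illustrative assumptions; a production router would typically use a lightweight classifier rather than keyword matching.

```python
# Hedged sketch of model routing: estimate request complexity and send it
# to the smallest model judged capable of the task.

SMALL_MODEL = "small-7b"    # hypothetical efficient model
LARGE_MODEL = "large-175b"  # hypothetical frontier model

def estimate_complexity(prompt: str) -> str:
    """Toy heuristic: long prompts or reasoning keywords imply 'complex'."""
    reasoning_markers = ("explain why", "step by step", "analyze", "compare")
    if len(prompt.split()) > 200 or any(m in prompt.lower() for m in reasoning_markers):
        return "complex"
    return "simple"

def route(prompt: str) -> str:
    """Return the smallest capable model for this prompt."""
    return LARGE_MODEL if estimate_complexity(prompt) == "complex" else SMALL_MODEL

print(route("Classify this ticket as billing or technical."))        # small-7b
print(route("Analyze and compare these two contracts step by step."))  # large-175b
```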


Strategy 3: Reduce unnecessary inference

Every AI query has an energy cost. Design applications to avoid unnecessary queries:

  • Caching: Semantic caching reduces redundant inference by 20-50% for typical enterprise applications
  • Batching: Processing multiple items in one call vs individual calls
  • Early exit: For classification tasks, simple cases exit early without full model processing
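The caching idea can be illustrated with a minimal cache around the model call. This sketch uses a normalized exact-match key as a simplification; a real semantic cache would match embeddings of similar queries, which is how the 20-50% reductions cited above are reached.

```python
import hashlib

# Simplified sketch of inference caching. Normalizing whitespace and case
# already catches repeated identical requests without a model call.
class InferenceCache:
    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def _key(self, prompt: str) -> str:
        normalized = " ".join(prompt.lower().split())
        return hashlib.sha256(normalized.encode()).hexdigest()

    def get_or_compute(self, prompt, compute_fn):
        key = self._key(prompt)
        if key in self._store:
            self.hits += 1           # served from cache: no inference energy
            return self._store[key]
        self.misses += 1
        result = compute_fn(prompt)  # the expensive model call
        self._store[key] = result
        return result

cache = InferenceCache()
answer = lambda p: f"answer to: {p}"
cache.get_or_compute("What is our refund policy?", answer)
cache.get_or_compute("what is our  refund policy?", answer)  # normalized hit
print(cache.hits, cache.misses)  # 1 1
```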

Strategy 4: Optimize prompts for length

Input tokens cost energy to process. Shorter prompts consume less energy:

  • Remove unnecessary system prompt boilerplate
  • Use more concise instructions
  • Eliminate padding and repetition
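A prompt-slimming pass along these lines can be automated. The boilerplate phrases below are illustrative assumptions, and whitespace word count is only a rough proxy for tokens; a real pipeline would measure with the provider's tokenizer.

```python
# Sketch of prompt slimming: strip known boilerplate and collapse padding.
BOILERPLATE = [
    "You are a helpful assistant.",
    "Please think carefully before answering.",
]

def slim_prompt(prompt: str) -> str:
    for phrase in BOILERPLATE:
        prompt = prompt.replace(phrase, "")
    return " ".join(prompt.split())  # collapse padding and repetition

before = ("You are a helpful assistant. Please think carefully before answering. "
          "Summarize the attached report   in three bullet points.")
after = slim_prompt(before)
print(len(before.split()), "->", len(after.split()))  # 18 -> 8
```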

Strategy 5: Select efficient hardware

For on-premise AI deployments:

  • NVIDIA H100/H200 provide significantly better performance per watt than previous generations
  • AMD Instinct MI300 series offers competitive energy efficiency
  • Custom AI accelerators (Google TPUs, AWS Trainium) are optimized for specific workload types

Measurement and Reporting

For ESG reporting, enterprises need to track:

Energy consumption: Kilowatt-hours consumed by AI workloads (available from cloud provider dashboards).

Carbon emissions: CO2 equivalent from AI energy use (convert using regional grid intensity or provider-provided carbon factors).

Inference efficiency: Energy per successful query — tracks improvement over time.

Model efficiency metrics: Parameters per task type, energy per token generated.
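The conversions above are straightforward arithmetic. In the sketch below, the grid-intensity figures are illustrative assumptions; in practice you would use your provider's published carbon factors or your region's actual grid data.

```python
# Converting AI energy use to CO2e and tracking energy per query.
GRID_INTENSITY_KG_PER_KWH = {
    "low-carbon region": 0.05,  # e.g. hydro/nuclear-heavy grid (assumed)
    "average grid": 0.4,        # rough order-of-magnitude figure (assumed)
}

def co2e_kg(kwh: float, region: str) -> float:
    """CO2-equivalent in kg for a given energy use and grid intensity."""
    return kwh * GRID_INTENSITY_KG_PER_KWH[region]

def energy_per_query_kwh(total_kwh: float, successful_queries: int) -> float:
    """Inference-efficiency metric: energy per successful query."""
    return total_kwh / successful_queries

monthly_kwh = 1_200.0  # hypothetical monthly AI workload energy
print(co2e_kg(monthly_kwh, "average grid"))          # 480.0
print(energy_per_query_kwh(monthly_kwh, 2_000_000))  # 0.0006
```

Tracking energy per successful query over time is what makes efficiency improvements visible in ESG reports.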


Balancing Sustainability and Capability

There are genuine tradeoffs between AI capability and sustainability:

  • More capable models use more energy per query
  • Higher accuracy often requires more compute
  • Faster responses may require more hardware running in parallel

The right balance involves:

  1. Never using more compute than the task requires (right-sizing)
  2. Using renewable energy for the compute you do need
  3. Continuously improving efficiency as better models and hardware become available

This is not a reason to avoid AI investment — AI can create significant sustainability benefits (optimizing energy grids, reducing waste, improving logistics efficiency) that far exceed its direct environmental costs when applied thoughtfully.


Conclusion

AI sustainability is not an optional consideration for enterprises with ESG commitments. The energy and water footprint of AI programs is real, measurable, and increasingly scrutinized. Organizations that build sustainability into their AI programs from the start — choosing efficient providers, right-sizing models, and tracking impact — will be better positioned for increasing ESG reporting requirements and stakeholder scrutiny.

