How Tech Giants are Reshaping Corporate Climate Commitments

AI is Creating a Two-Tier System for Corporate Climate Action

Driven primarily by a massive expansion in data centers, AI energy demand could reach 400 terawatt-hours (TWh) by 2030, a sharp increase from just under 100 TWh in 2020. The energy consumption of these centers is escalating rapidly due to the computationally intensive nature of AI, especially large language models (LLMs) and generative AI. According to a recent McKinsey survey, nearly 80% of organizations surveyed already use AI in at least one business function, and generative AI adoption grew from 33% in 2023 to 71% in 2024.
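To put those projections in perspective, the figures above imply a steep compound growth rate. A minimal back-of-envelope calculation, using only the article's two data points (roughly 100 TWh in 2020 and a projected 400 TWh in 2030):

```python
# Implied compound annual growth rate (CAGR) of AI-related electricity
# demand, using the article's figures: ~100 TWh in 2020 rising to a
# projected 400 TWh by 2030. Illustrative arithmetic only.
start_twh, end_twh, years = 100, 400, 10

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # roughly 15% per year
```

A quadrupling over a decade works out to roughly 15% annual growth, far outpacing the low single-digit growth typical of overall electricity demand.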

Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia, and Tesla account for roughly one-third of the S&P 500’s total market capitalization. Each possesses the financial resources and strategic leverage to secure dedicated clean energy sources for its operations regardless of cost. Most corporations do not have this capability, effectively handing the tech elite a competitive advantage built on energy infrastructure assets.

In September 2024, Microsoft signed a 20-year power purchase agreement with Constellation Energy to restart Three Mile Island Unit 1, renamed the Crane Clean Energy Center, which is slated to begin supplying power to Microsoft in 2027. Amazon recently announced investments exceeding $500 million in small modular reactors (SMRs) and other advanced nuclear technologies as part of its strategy to power operations with carbon-free energy and meet growing demand. These include $334 million for a multiyear feasibility study with Energy Northwest exploring the construction of a cluster of SMRs at the Hanford site in Washington State; participation in a $500 million fundraising round for X-energy, a developer of advanced SMRs and fuel technology; and an agreement with Dominion Energy to explore developing an SMR project near its existing North Anna Power Station in Virginia.

The Computational Cost of AI: Training vs. Inference

AI increases energy demand because the computational needs of LLMs and other generative AI far exceed those of traditional computing. Meeting those needs requires powerful, energy-intensive hardware housed in massive data centers, driving a substantial increase in electricity consumption for both operations and cooling. Two main phases in the lifecycle of an AI model account for most of this energy usage:

  • Training: This is the most energy-intensive part of the process. Training a large AI model involves feeding it vast amounts of data over weeks or months, which requires thousands of high-performance graphics processing units (GPUs) and other specialized processors to run continuously. The sheer scale and duration of these computations consume huge amounts of electricity.
  • Inference: This is the phase where the trained AI model is used to generate responses or perform tasks, such as answering a query through a prompt interface. While a single inference task uses a small amount of energy, the cumulative effect is significant. With millions of people using generative AI tools daily, the energy required for inference is growing rapidly and is expected to eventually exceed the energy demand of the training phase.
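The training-versus-inference crossover described above can be sketched with simple arithmetic. Every number below is an illustrative assumption, not a measured figure for any specific model, but the shape of the result holds: a one-time training cost, however large, is eventually overtaken by a steady stream of per-query inference costs.

```python
# Back-of-envelope sketch of why cumulative inference energy overtakes
# one-time training energy. All parameters are hypothetical assumptions
# chosen for illustration, not measurements of any real model.

TRAINING_ENERGY_MWH = 1_300      # assumed one-time training energy cost
ENERGY_PER_QUERY_WH = 0.3        # assumed energy per inference query
QUERIES_PER_DAY = 50_000_000     # assumed daily query volume at scale

# Convert per-query Wh into total MWh consumed per day of inference
daily_inference_mwh = QUERIES_PER_DAY * ENERGY_PER_QUERY_WH / 1_000_000

# Days until cumulative inference energy exceeds the training budget
days_to_exceed_training = TRAINING_ENERGY_MWH / daily_inference_mwh

print(f"Inference consumes ~{daily_inference_mwh:.0f} MWh/day")
print(f"Cumulative inference passes training after ~{days_to_exceed_training:.0f} days")
```

Under these assumed parameters, inference consumes about 15 MWh per day and surpasses the entire training budget in under three months, which is why sustained, high-volume deployment, not training alone, increasingly drives data center power demand.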

The Environmental and Economic Impact of AI Data Centers

The energy consumption of AI is driving a significant increase in the power demand of data centers, the backbone of AI technology, whose electricity use is projected to double within the next five years. This unprecedented demand is placing considerable strain on electricity grids, which were not built to accommodate such a rapid and massive increase in power consumption, leading to the following consequences:

  • Increased infrastructure costs: Utilities must build new power plants and transmission lines to meet this demand, with costs often passed on to consumers through higher electricity rates.
  • Grid instability: The rapid growth of data centers, many of which are built faster than the grid can be upgraded, can cause power quality problems and even risk blackouts.
  • Water consumption: AI servers generate substantial heat and depend on extensive cooling systems that consume large amounts of water, putting additional stress on local water supplies, particularly in regions already facing scarcity.

A Two-Tiered Climate Landscape: The Future of AI and Corporate Energy Strategy

AI’s rapid growth, driven particularly by LLMs and generative AI, is projected to push energy demand to 400 TWh by 2030, straining grids and resources. This increase is creating a two-tiered corporate climate landscape in which tech giants leverage their financial power to secure dedicated clean energy (e.g., SMRs and other nuclear capacity), gaining a competitive edge. Most other companies, however, will struggle with rising energy costs and limited sustainable options, which will challenge their climate commitments. The primary focus for the broader corporate world will shift from emissions reduction to securing reliable, affordable clean energy in an AI-driven market, a shift that will necessitate a reevaluation of energy policies. If your company is looking for help with its technology or clean energy strategy, please contact Canopy Edge to schedule an initial consultation.

Clint Wheelock, Managing Director


Clint is a Managing Director at Canopy Edge, responsible for management of the consulting team, project execution and quality assurance, and content strategy. He has over 25 years of management consulting and market analysis experience, focused on sustainability, energy, and emerging technology sectors.