The Hidden Cost of AI: Massive Energy Consumption in Advanced Computing

AI’s voracious appetite for electricity is not only substantial, but expanding rapidly as the technology advances.
A general view of the 1,450 m² main room of the CERN Computer/Data Centre and server farm during a behind-the-scenes tour at CERN, the world’s largest particle physics laboratory, in Meyrin, Switzerland, on April 19, 2017. Dean Mouhtaropoulos/Getty Images
By Shawn Lin and Sean Tseng
As artificial intelligence continues to evolve, its impact on society grows more profound. Yet beyond the innovations lies a less discussed aspect: AI’s voracious appetite for electricity. This demand for power is not just substantial; it is expanding rapidly as AI technologies, such as those from OpenAI, advance.

Energy Consumption of AI Training

Training a sophisticated AI model requires a combination of big data, advanced computing power, and robust algorithms. Consider OpenAI’s GPT-3, a model with 175 billion parameters: it was trained on 1,024 graphics processing units (GPUs) running nonstop for just over a month.
Mosharaf Chowdhury, an associate professor of electrical engineering and computer science at the University of Michigan, estimates that training GPT-3 consumed about 1,287 megawatt-hours (MWh) of electricity. To put this into perspective, that is roughly the electricity an average American household consumes over 120 years.
Since the launch of GPT-3 four years ago, the scale and complexity of these models have grown significantly. Its successor, GPT-4, launched in 2023, has 1.76 trillion parameters, 10 times as many as GPT-3, and was trained on 25,000 GPUs, more than 24 times as many as its predecessor. The energy consumed to train GPT-4 is estimated at between 51,773 and 62,319 MWh, a more than 40-fold increase over GPT-3. That much energy could sustain an average American household for about 5,000 years or, equivalently, meet the needs of 1,000 average U.S. households for roughly five to six years.
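
As a rough sanity check, these household comparisons can be reproduced in a few lines of Python. The baseline of about 10,700 kWh per year for an average U.S. household is an assumption (a common EIA ballpark); the article’s sources may have used a slightly different figure.

```python
# Sanity check on the household-year comparisons above.
# Assumption (not from the article): an average U.S. household uses
# roughly 10,700 kWh of electricity per year, per EIA ballpark figures.

HOUSEHOLD_KWH_PER_YEAR = 10_700

def household_years(mwh: float) -> float:
    """Convert a training-energy figure in MWh to equivalent household-years."""
    return mwh * 1_000 / HOUSEHOLD_KWH_PER_YEAR

gpt3_mwh = 1_287
gpt4_mwh_low, gpt4_mwh_high = 51_773, 62_319

print(f"GPT-3: {household_years(gpt3_mwh):.0f} household-years")        # ~120
print(f"GPT-4: {household_years(gpt4_mwh_low):.0f} to "
      f"{household_years(gpt4_mwh_high):.0f} household-years")          # ~4,800 to ~5,800
print(f"GPT-4 / GPT-3 energy ratio: {gpt4_mwh_low / gpt3_mwh:.0f}x+")   # ~40x
```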

The upcoming GPT-5 promises even greater capabilities and, consequently, even higher energy demands.

In this photo illustration, the welcome screen for the OpenAI "ChatGPT" app is displayed on a laptop screen on February 03, 2023, in London, England. Leon Neal/Getty Images

Long-term Energy Use in AI Applications

While training an AI model is a finite process, its application is ongoing. As AI becomes more integral to our daily lives and the user base expands, the cumulative electricity consumption becomes significant. According to the International Energy Agency (IEA), a single response from ChatGPT consumes an average of 2.9 watt-hours—about the energy needed to light a 60-watt bulb for three minutes. This rate is nearly 10 times that of an average Google search.

With approximately 200 million daily interactions, ChatGPT’s daily electricity consumption totals over 500,000 kilowatt-hours—roughly equivalent to the daily electricity needs of 17,000 American households. Annually, this amounts to 182.5 million kilowatt-hours.
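
These aggregates follow directly from the per-response figure. A minimal sketch, assuming the article rounds the daily total down to 500,000 kWh before deriving the household and annual numbers, and that an average U.S. household uses about 29 kWh per day:

```python
# Reproduce the ChatGPT aggregate-energy arithmetic from the IEA per-response figure.
# Assumptions: the household and annual figures are based on the rounded
# 500,000 kWh/day total, and an average U.S. household uses ~29 kWh/day
# (about 10,700 kWh/year).

WH_PER_RESPONSE = 2.9
RESPONSES_PER_DAY = 200_000_000
HOUSEHOLD_KWH_PER_DAY = 29

exact_daily_kwh = WH_PER_RESPONSE * RESPONSES_PER_DAY / 1_000
print(f"Exact daily total: {exact_daily_kwh:,.0f} kWh")  # 580,000 -> "over 500,000"

rounded_daily_kwh = 500_000
print(f"Household equivalents: {rounded_daily_kwh / HOUSEHOLD_KWH_PER_DAY:,.0f}")  # ~17,000
print(f"Annual total: {rounded_daily_kwh * 365 / 1e6:.1f} million kWh")            # 182.5
```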

If Google were to widely integrate generative AI into its search functions, the additional electricity required could be as high as 10 terawatt-hours (TWh) per year, the IEA estimates. For perspective, 10 TWh could cool 5 million homes, light over 10 million homes, or fully power 700,000 homes for an entire year.
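
Each of those comparisons implies a per-home consumption figure that simple division recovers. A quick sketch using only the article’s own numbers; the implied values are near common U.S. ballparks for air conditioning and lighting, though the “fully power” figure runs somewhat above the roughly 10,700 kWh/year household average assumed earlier:

```python
# Back out the per-home figures implied by the three 10 TWh comparisons.
KWH = 10 * 1e9  # 10 TWh expressed in kWh

print(f"Cooling:    {KWH / 5_000_000:,.0f} kWh per home per year")   # 2,000
print(f"Lighting:   {KWH / 10_000_000:,.0f} kWh per home per year")  # 1,000
print(f"Full power: {KWH / 700_000:,.0f} kWh per home per year")     # ~14,286
```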

Rising Energy Demands of Data Centers: The Backbone of AI

Data centers, the critical infrastructure supporting AI, are more than just warehouses filled with servers. They are dynamic, high-powered environments that provide the essential computational power, storage, and network bandwidth needed for AI applications to function seamlessly and scale effectively.

One of the lesser-known yet significant aspects of data centers is their cooling systems. To prevent overheating, these centers require robust cooling solutions to manage the heat generated by thousands of servers operating around the clock. Consequently, the energy consumption of data centers is substantial.

In 2022, data centers worldwide consumed approximately 460 TWh of electricity, according to an IEA report. This figure represents nearly 2 percent of global electricity demand. Notably, this consumption is not driven solely by AI operations; nearly a quarter, or about 110 TWh, was attributed to cryptocurrency mining.

The IEA report details that the primary sources of energy consumption within data centers are computing and cooling, each accounting for 40 percent of the total usage. The remaining 20 percent is consumed by other related IT equipment.
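
Applied to the 2022 total, that split yields rough absolute figures. A minimal sketch, assuming the 40/40/20 breakdown applies to the full 460 TWh (which also includes cryptocurrency mining; the IEA may scope the split differently):

```python
# Apply the IEA's reported breakdown to the 2022 worldwide total of 460 TWh.
TOTAL_TWH_2022 = 460
shares = {"computing": 0.40, "cooling": 0.40, "other IT equipment": 0.20}

for category, share in shares.items():
    print(f"{category}: ~{TOTAL_TWH_2022 * share:.0f} TWh")
# computing: ~184 TWh, cooling: ~184 TWh, other IT equipment: ~92 TWh
```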

Looking ahead, the IEA anticipates a significant surge in demand. By 2026, data centers are projected to consume up to 1,000 TWh of electricity annually, roughly Japan’s current yearly consumption. The increase alone, from 460 TWh in 2022, would be comparable to adding the annual electricity consumption of a country somewhere between Sweden and Germany in size.

An employee of the German Climate Computing Center (DKRZ, or Deutsches Klimarechenzentrum) next to the “Mistral” supercomputer, installed in 2016, at the German Climate Computing Center on June 7, 2017, in Hamburg, Germany. Morris MacMatzen/Getty Images

The advent of artificial intelligence is not just reshaping technology but also fundamentally altering the infrastructure that supports it. As such, data centers are undergoing rapid expansion and modernization to keep pace with technological advancements, leading to a corresponding increase in energy consumption.

In 2015, the global count of data centers stood at approximately 3,600. This number more than doubled to 8,000 by 2021 and surged to 10,978 by the end of 2023. The proliferation of these facilities is expected to continue.
The United States leads this expansion with 5,388 data centers. John Ketchum, CEO of NextEra Energy Inc., the world’s largest privately owned developer of wind and solar energy, forecasts a 40 percent increase in U.S. electricity demand over the next two decades, in stark contrast to the 9 percent increase of the previous 20 years, according to Bloomberg. Mr. Ketchum attributes this dramatic rise primarily to data centers driven by the demands of AI.

‘Sovereign AI’ Movement and Global AI Dominance

As AI becomes increasingly integral to economic development, national security, and international competition, especially between powerhouses like the United States and China, nations are embracing the “Sovereign AI” movement. This initiative involves countries across Asia, the Middle East, Europe, and the Americas investing in their own national AI computing facilities to ensure they have control over their technological futures.

At the World Governments Summit held in Dubai in February 2024, Nvidia founder and CEO Jensen Huang emphasized the necessity for each country to develop its own intelligence products.

The quest for AI supremacy may hinge on which countries can sustain the escalating demands for data centers and the electricity they consume. This dynamic suggests that the energy needs driven by AI and its supporting infrastructures are set to grow, potentially without limit.

Nvidia CEO Jensen Huang delivers his keynote speech ahead of Computex 2024 in Taipei on June 2, 2024. Computex is the top annual tech showcase in Taiwan, whose advanced semiconductor industry is crucial to the production of everything from iPhones to the servers that run ChatGPT. Sam Yeh/AFP via Getty Images

Strategies for Sustainable AI Development

As the energy demands of data centers skyrocket, major cloud service providers such as Amazon, Microsoft, and Google are committing to significant environmental goals. Each has pledged to power its data centers entirely with renewable energy and is exploring technological methods to minimize electricity consumption and better balance demand on the electricity grid.

These companies’ efforts include enhancing the efficiency of chips and servers and reducing cooling requirements, one of the largest energy drains in data center operations.

At the forefront of the conversation on sustainable AI is the recognition that current energy solutions may not suffice as AI technologies evolve and their energy needs expand. Sam Altman, CEO of OpenAI, underscored this point at the World Economic Forum Annual Meeting in Davos in January 2024.

He stated that the future scalability of AI technology is contingent on breakthroughs in energy technology, particularly because the projected energy consumption of advancing AI technologies will likely surpass current estimates. This challenge has spurred increased investments in alternative energy sources, notably nuclear fusion.

Highlighting this shift toward innovative energy solutions, TerraPower, a company founded by Microsoft co-founder Bill Gates, recently began construction of a next-generation nuclear power plant in Wyoming. The plant is unique in using sodium instead of water for cooling, which Mr. Gates believes will revolutionize electricity generation. During the groundbreaking ceremony, he described the site as poised to become “the bedrock of America’s energy future.”