DOSSIER
DIGITAL LIFE
It takes 10 times more electricity for ChatGPT to respond to a prompt than for Google to carry out a standard search. Still, researchers are struggling to get a grasp on the energy implications of generative artificial intelligence both now and going forward.
Few people realize that the carbon footprint of digital technology is on par with that of the aerospace industry, accounting for between 2% and 4% of global carbon emissions. And this digital carbon footprint is expanding at a rapid pace. When it comes to power use, the approximately 11,000 data centers in operation today consume just as much energy as the entire country of France did in 2022, or around 460 TWh. Will the widespread adoption of generative AI send those figures soaring?
The new technology will clearly affect the amount of energy that's consumed worldwide, but exactly how is hard to quantify. "We need to know the total cost of generative AI systems to be able to use them as efficiently as possible," says Manuel Cubero-Castan, the project manager for Sustainable IT at EPFL.
He believes we should consider the entire life cycle of generative AI technology, from the extraction of minerals and the assembly of components (activities whose impacts are not limited to energy use) to the disposal of the tons of electronic waste generated, much of which is dumped illegally. From this perspective, the environmental ramifications of generative AI go well beyond the power and water consumption of data centers alone.
The cost of training

For now, most of the data available on digital technology power use relates only to data centers. According to the International Energy Agency (IEA), these centers (excluding data networks and cryptocurrency mining) consumed between 240 TWh and 340 TWh of power in 2022, or 1% to 1.3% of the global total. Yet even though the number of centers has been growing by 4% per year, their overall power use didn't change much between 2010 and 2020, thanks to energy-efficiency improvements.
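As a sanity check, those percentages imply a global electricity consumption of roughly 24,000 to 26,000 TWh in 2022, consistent with commonly cited totals. A minimal back-of-the-envelope sketch in Python (the variable names are ours):

```python
# What global electricity total do the IEA's data-center figures imply?
dc_low_twh, dc_high_twh = 240, 340    # data-center consumption, 2022 (IEA)
share_low, share_high = 0.010, 0.013  # 1% to 1.3% of the global total

implied_low = dc_low_twh / share_low     # ~24,000 TWh
implied_high = dc_high_twh / share_high  # ~26,150 TWh
print(f"Implied global total: {implied_low:,.0f} to {implied_high:,.0f} TWh")
```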
With generative AI set to be adopted on a massive scale, that will certainly change. Generative AI technology is based on large language models (LLMs) that use power in two ways. First, while they're being trained, a step that involves running terabytes of data through algorithms so that they learn to predict words and sentences in a given context. Until recently, this was the most energy-intensive step.

Second, while they're processing data in response to a prompt, a step known as inference. Now that LLMs are deployed on a large scale, inference is the step that requires the most energy: recent data from Meta and Google suggest it accounts for 60% to 70% of the power used by generative AI systems, versus 30% to 40% for training.
ChatGPT query vs. conventional Google search

A ChatGPT query consumes around 3 Wh of power, while a conventional Google search uses 0.3 Wh, according to the IEA. If all of the approximately 9 billion Google searches performed daily were switched to ChatGPT, that would increase the total power requirement by 10 TWh per year.
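The arithmetic behind that figure is straightforward. A minimal sketch using the IEA per-query numbers quoted above (the incremental cost alone lands just under 9 TWh, which the IEA rounds to 10):

```python
# Extra electricity if 9 billion daily searches moved to ChatGPT,
# using the IEA's per-query figures quoted above.
WH_PER_CHATGPT_QUERY = 3.0   # Wh per ChatGPT prompt
WH_PER_GOOGLE_SEARCH = 0.3   # Wh per conventional Google search
QUERIES_PER_DAY = 9e9        # ~9 billion searches per day

extra_wh_per_day = QUERIES_PER_DAY * (WH_PER_CHATGPT_QUERY - WH_PER_GOOGLE_SEARCH)
extra_twh_per_year = extra_wh_per_day * 365 / 1e12
print(f"Extra demand: {extra_twh_per_year:.1f} TWh per year")  # ~8.9
```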
Goldman Sachs Research (GSR) estimates that the amount of electricity used by data centers will swell by 160% over the next five years, and that they will account for 3% to 4% of global electricity use. In addition, their carbon emissions will likely double between 2022 and 2030.
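For context, growth of 160% over five years corresponds to an annual growth rate of roughly 21%. The GSR report gives only the five-year figure; the annualized rate below is our own arithmetic:

```python
# Convert GSR's five-year growth projection into an implied annual rate.
five_year_growth = 1.60  # +160% over five years, i.e. a factor of 2.6
annual_rate = (1 + five_year_growth) ** (1 / 5) - 1
print(f"Implied annual growth: {annual_rate:.1%}")  # ~21.1%
```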
According to IEA figures, total power demand in Europe decreased for three years in a row but picked up in 2024 and should return to 2021 levels—some 2,560 TWh per year—by 2026. Nearly a third of this increase will be due to data centers. GSR estimates that the AI-related power demand at data centers will grow by approximately 200 TWh per year between 2023 and 2030. By 2028, AI should account for nearly 19% of data centers' energy consumption.
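Taken together, those two GSR figures also give an order-of-magnitude sense of total data-center demand: if AI-related consumption on the order of 200 TWh represents roughly 19% of the total, data centers overall would draw around 1,000 TWh. The two projections use different horizons (2030 and 2028), so treat this only as a rough cross-check:

```python
# Rough cross-check of GSR's projections: if ~200 TWh of AI-related demand
# is ~19% of all data-center consumption, what total does that imply?
ai_demand_twh = 200  # AI-related data-center demand (GSR, toward 2030)
ai_share = 0.19      # AI share of data-center energy by 2028 (GSR)
implied_total_twh = ai_demand_twh / ai_share
print(f"Implied data-center total: ~{implied_total_twh:,.0f} TWh")  # ~1,053
```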
However, the rapid expansion of generative AI could wrong-foot these forecasts. Chinese company DeepSeek is already shaking things up—it introduced a generative AI program in late January that uses less energy than its US counterparts for both training algorithms and responding to prompts.
Another factor that could stem the growth in AI power demand is the limited amount of mining resources available for producing chips. Nvidia currently dominates the market for AI chips, with a 95% market share. The three million Nvidia H100 chips installed around the world used 13.8 TWh of power in 2024—the same amount as Guatemala. By 2027, Nvidia chips could burn through 85 to 134 TWh of power. But will the company be able to produce them at that scale?
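Those fleet-level numbers can be sanity-checked per chip: spreading 13.8 TWh across three million H100s running around the clock works out to an average draw of roughly 525 W per chip, plausible for a processor whose most power-hungry variant is rated at up to 700 W. A minimal sketch:

```python
# Average power per chip implied by the fleet-level figure for 2024.
FLEET_ENERGY_TWH = 13.8  # total H100 consumption in 2024
NUM_CHIPS = 3e6          # ~3 million H100s installed worldwide
HOURS_PER_YEAR = 8760

avg_watts = FLEET_ENERGY_TWH * 1e12 / (NUM_CHIPS * HOURS_PER_YEAR)
print(f"Average draw per chip: {avg_watts:.0f} W")  # ~525 W
```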