TECH

Data centres’ insatiable demand for electricity will change the entire energy sector
The release of the first large language models triggered a headache for authorities around the world as they tried to figure out how to satisfy data centres’ endless demand for electricity.
AI models are on an energy-intensive training session with no end in sight. The training takes place on servers in the world’s data centres, which currently number just over 10,000. Large language models and the generative AI systems that create images and videos are especially heavy consumers of electricity.
They are so voracious that the International Energy Agency (IEA) has estimated that the power they needed for computing increased a billion-fold from 2022 to 2024.
The entire global energy sector is now changing because the demand for electricity to run and cool servers is so high.
Control in just a few hands

“What we see happening now in the development of artificial intelligence is truly extraordinary and perhaps a pivotal point in human history,” said Sebastien Gros.
He is a professor at NTNU’s Department of Engineering Cybernetics and head of the Norwegian Centre on AI-Decisions (AID). This is one of Norway’s six new centres for research on AI.
Gros notes that developments in the AI universe are driven by just a few technology companies. The scale, electricity consumption, investments and pace are formidable, and control is concentrated in the hands of a small number of private enterprises.
Large, undisclosed figures

It is currently not possible to find out exactly how much electricity AI providers use. “The companies that supply electricity to the data centres do not disclose these figures. The AI providers are commercial operators whose aim is to make money, and they have little interest in sharing this type of information. I don’t think anyone knows exactly, but the figures are clearly astronomical. Truly astronomical,” said Gros.
Each keystroke consumes electricity

Regardless of whether the chatbot has become your office assistant, meal planner or psychologist, every one of your keystrokes consumes electricity.
Sam Altman, the CEO of OpenAI, has said that the polite but completely unnecessary ‘thank you’ or ‘please’ costs the company tens of millions of dollars in electricity each year.
Data centres currently use approximately 400 terawatt-hours of electricity a year, or about one and a half per cent of global electricity consumption. The IEA estimates that this will double over the next five years, reaching a level comparable to the entire electricity consumption of Japan.
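A rough sanity check of these figures is straightforward. The sketch below is illustrative only; the Japan reference value of about 900 TWh per year is an assumption, not a number from the article.

```python
# Back-of-the-envelope check of the IEA figures quoted above.
# Assumption (not from the article): Japan consumes roughly 900 TWh
# of electricity per year, used here only as a reference point.

data_centre_twh = 400        # current data-centre consumption, TWh/year
share_of_global = 0.015      # roughly 1.5 % of global electricity use
japan_twh = 900              # assumed reference value

global_twh = data_centre_twh / share_of_global
doubled_twh = 2 * data_centre_twh

print(f"Implied global electricity use: {global_twh:,.0f} TWh/year")
print(f"Data centres after doubling:    {doubled_twh:,.0f} TWh/year")
print(f"Share of Japan's consumption:   {doubled_twh / japan_twh:.0%}")
```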
Americans are noticing higher electricity prices

Some of the data centres built for companies like Microsoft, Google, Amazon and Meta consume more electricity than large cities like Pittsburgh and New Orleans. Already heavily burdened power grids are under even more strain, and Americans are starting to notice that their electricity bills are rising.
The world’s largest data centre to date is currently being built in Jamnagar, India. The American chipmaker Nvidia, which has become the world’s most valuable company by selling chips for AI development, is heavily involved. According to the IEA, the centre could end up using as much electricity as the 10 million people living in the area.
Making difficult decisions with AI

Of all the aspects of AI development, commercial language models have received the most attention so far. However, in the shadow of these popular models, entirely different, efficient tools are being developed.
They run locally, use far less electricity, and can help us with entirely different tasks, such as detecting diseases faster, optimizing the power grid and perhaps even tackling the climate crisis.
NTNU and SINTEF researchers at the AID centre are working to integrate AI more closely with industrial players and government authorities.
The goal is to develop tools that can manage risk and make decisions in challenging situations. These might be as varied as determining when it is safe to discharge a patient from the hospital, or ensuring stable electricity supply and production that is optimally adapted to consumption.
AI for smarter energy use

“Energy for AI, and AI for Energy,” said Fatih Birol, Executive Director of the International Energy Agency, when presenting the Energy & AI report in April 2025.
So, what was his point? That AI can also be part of the solution. According to the report, using AI to operate power grids more efficiently could free up as much as 175 gigawatts of transmission capacity.
To put this into perspective, that could cover the electricity needs of 175 cities the size of Oslo for one year.
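As a rough illustration of that comparison, assume an Oslo-sized city uses on the order of 9 TWh of electricity per year (an assumed figure, not from the article):

```python
# Rough check of the "175 GW is roughly 175 Oslo-sized cities" comparison.
# Assumption (not from the article): an Oslo-sized city uses about
# 9 TWh of electricity per year.

capacity_gw = 175
hours_per_year = 8760
oslo_twh_per_year = 9

energy_twh = capacity_gw * hours_per_year / 1000   # GWh -> TWh
print(f"175 GW used around the clock for a year: {energy_twh:,.0f} TWh")
print(f"Oslo-sized cities covered: {energy_twh / oslo_twh_per_year:.0f}")
```

Under that assumption, the arithmetic lands close to the article’s figure of around 175 cities.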
Uncertain development in Norway

The Norwegian Water Resources and Energy Directorate (NVE) has analysed how the country’s energy market will change and grow. It estimates that AI and data centres will account for two per cent of electricity consumption in Norway by 2050.
“I’m not exactly blown away by this number. The trends in Norway may not be as dramatic as we thought,” said Magnus Korpås, an energy systems expert and professor at NTNU’s Department of Electric Energy.
“Based on NVE’s figures, it looks like computing power will consume about as much electricity as transport and other major electricity consumers, which will increase in the years to come. And in any case, two per cent is very little compared to the 15 per cent used by electric panel heaters to warm our homes,” Korpås said.
Export of computing power is a political choice

Much remains uncertain, however. How things will look in 2050 partly depends on how data centres develop. Norway currently has just over 70 registered data centres.
“NVE considers two per cent to be a realistic level that developers can reach and that Norway is able to handle. Worldwide, however, the demand for power is inexhaustible. Norway may be very well suited to becoming a major exporter of clean computing power. But then again, no one has actually suggested this,” Korpås said.
The big question

Korpås doesn’t think that making Norwegian electricity available for a moderate number of data centres poses a dilemma. “But the big question is whether we should make the power system and Norway’s natural environment available for the inexhaustible global consumption of AI. Whether we want to become the world’s hub for computer power is a political question,” he said.
He adds that this could very quickly be the case if we do not establish regulations.
“Establishing a centre here is attractive. Electricity is cheap, we have a cool climate, and the market is endless.”
Economically, politically and environmentally sensible?

Google is developing a centre in Skien. Green Mountain and TikTok have established themselves in Hamar. In Arendal, Bifrost Edge AS wants to build a centre that would use as much electricity per year as just over 100,000 households. This summer, Kjell Inge Røkke and Aker launched the major Stargate Norway project in Narvik in collaboration with OpenAI.
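For a sense of scale on the Arendal figure, here is a rough, hypothetical estimate; the average household consumption of about 16,000 kWh per year is an assumption, not a number from the article.

```python
# Rough estimate of what "just over 100,000 households" corresponds to.
# Assumption (not from the article): an average Norwegian household
# uses about 16,000 kWh of electricity per year.

households = 100_000
kwh_per_household = 16_000

total_twh = households * kwh_per_household / 1e9   # kWh -> TWh
print(f"Roughly {total_twh:.1f} TWh per year")      # about 1.6 TWh/year
```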
Professor Sebastien Gros sees rational arguments for developing data centres in Norway, especially in the north.
“Financially, it is very rational to build in Narvik, an area with large electricity surpluses and low prices. Politically, it also makes sense, because it provides the country with revenue. And environmentally, clean Norwegian hydropower is better than coal power in California or China. We need to consider the advantages and disadvantages and look at the bigger picture,” said Gros.
Source: NTNU / Norwegian SciTech News