Scale Is All You Need? Part 3-2

Note: If you haven’t seen Part 3-1, you can read it here.

Energy

US power demand likely to experience growth
not seen in a generation.

Goldman Sachs

After analyzing the exponentially increasing demand for compute, driven largely by the growing complexity and size of modern AI models, we now turn our attention to one of the fundamental requirements for these developments: power. Without a sufficient energy supply, even the most powerful data centers and GPUs cannot do their work.

But as models continue to grow in size, energy requirements are also growing at a breathtaking pace. Each additional GPU used to train models like GPT, Llama or DALL-E consumes not only more computing power but also more electricity. This leads us to a crucial bottleneck on the road to AGI: the availability and sustainability of energy. While companies like NVIDIA are working on more efficient chips (NVIDIA's B200 series, for example, is said to be significantly more energy-efficient than the H100 series), the energy question remains unresolved: power consumption is rising exponentially, and the efficiency gains of modern hardware can only cushion this development to a limited extent.

A glance at the infrastructure already shows that the energy demand of AI applications has reached new dimensions. Microsoft's 20-year deal to buy roughly 835 megawatts from the restarted Three Mile Island nuclear plant is a clear indicator of how much electricity is needed to operate AI systems today. OpenAI's Sam Altman is likewise securing access to enormous amounts of energy to safeguard the future development of his AI models.

But what happens when energy demand continues to rise? Can renewable energies or nuclear power plants keep pace with this demand? And what geopolitical implications will arise if access to electricity resources becomes the new question of power? Answering these questions is essential, because one thing is clear: without sufficient energy supply, progress in AI development and the achievement of AGI will be significantly slowed down or even stopped.

In the following, we therefore examine the current and future electricity demand of AI developments. We analyze how much energy is needed to train and operate modern models, what the forecasts are for future consumption, and whether the current infrastructure is capable of meeting this demand. One thing is clear: the path to AGI will be decisively influenced not only by compute, but also by the energy question. We then turn to the question of access and power in the context of social inequality over compute and electricity.

Data centers currently consume around 3% of the world's electricity. This is expected to rise to 6% by 2030. A study by Goldman Sachs clearly demonstrates the increasing demand.

“Data centers and their associated transmission networks have become a primary driver of global energy consumption. At present, this accounts for 3% of global consumption, emitting as much CO2 as Brazil. Energy requirements show no sign of slowing down either, as consumption could grow from 460 terawatt-hours in 2022 to 1,000 TWh in 2026. In the United States alone, data center power demand is expected to rise from 200 TWh in 2022 to 260 TWh in 2026, equivalent to 6% of all power use across the country. Data centers’ energy demands are now expected to double by 2030.”

“Driven by AI, broader demand and a deceleration in the pace of energy efficiency gains, global data center power demand is poised to more than double by 2030 after being flattish in 2015-20. (...) We estimate about 47 GW of incremental power generation capacity will be required to support US data center power demand growth cumulatively through 2030, met with about 60% gas and 40% renewable sources. We expect this to drive about $50 bn of capital investment in US power generation capacity cumulatively through 2030.”
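As a quick sanity check, here is the arithmetic implied by the quoted figures; the snippet below uses only the numbers quoted above and introduces no new data.

```python
# Quick arithmetic on the quoted Goldman Sachs figures (sanity check only).
global_2022_twh, global_2026_twh = 460, 1000    # global data center demand
us_2022_twh, us_2026_twh = 200, 260             # US data center demand

# Implied compound annual growth rates over the four years 2022 -> 2026:
global_cagr = (global_2026_twh / global_2022_twh) ** (1 / 4) - 1
us_cagr = (us_2026_twh / us_2022_twh) ** (1 / 4) - 1
print(f"Implied global growth 2022-2026: ~{global_cagr:.1%}/yr")  # ~21.4%
print(f"Implied US growth 2022-2026:     ~{us_cagr:.1%}/yr")      # ~6.8%

# The quoted 47 GW of new US capacity, split 60% gas / 40% renewables:
new_capacity_gw = 47
print(f"Gas:        ~{new_capacity_gw * 0.6:.1f} GW")             # ~28.2 GW
print(f"Renewables: ~{new_capacity_gw * 0.4:.1f} GW")             # ~18.8 GW
```

Roughly 21% annual growth in global data center demand is the pace hiding behind the "doubling" headlines.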

“AI’s energy use currently only represents a fraction of the technology sector’s power consumption, which is estimated to be around 2-3% of total global emissions. This is likely to change as more companies, governments and organizations use AI to drive efficiency and productivity. Data centres are already significant drivers of electricity demand growth in many regions.”

But it could be even worse, because electricity demand is not growing linearly but exponentially. Training compute scales roughly with the product of parameter count and training tokens (a common rule of thumb is C ≈ 6 · N · D FLOPs), and because compute-optimal training also scales the dataset with model size, doubling the number of parameters roughly quadruples the training compute, and with it the energy required. This kind of scaling effect is particularly evident in the largest AI models.
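To make that concrete, here is a minimal back-of-the-envelope sketch. It uses the common C ≈ 6 · N · D approximation for training FLOPs; the model sizes, token counts and the effective hardware efficiency (FLOPs per joule) are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope training energy estimate (all numbers illustrative).
# Rule of thumb: training compute C ≈ 6 * N * D FLOPs
# (N = parameters, D = training tokens).

def training_energy_gwh(params: float, tokens: float,
                        flops_per_joule: float = 1e11) -> float:
    """Estimate training energy in GWh.

    flops_per_joule is an assumed effective efficiency (~100 GFLOPs/J),
    i.e. sustained throughput divided by total facility power.
    """
    flops = 6 * params * tokens          # total training compute
    joules = flops / flops_per_joule     # energy consumed
    return joules / 3.6e12               # 1 GWh = 3.6e12 J

# Doubling parameters under compute-optimal (Chinchilla-style) training
# also roughly doubles the token count, so energy grows ~4x, not 2x:
base = training_energy_gwh(params=70e9, tokens=1.4e12)      # 70B model
doubled = training_energy_gwh(params=140e9, tokens=2.8e12)  # 140B model
print(f"70B model:  ~{base:.1f} GWh")
print(f"140B model: ~{doubled:.1f} GWh ({doubled / base:.0f}x)")
```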

Big Tech is spending tens of billions of dollars each quarter on AI accelerators, which has led to an exponential increase in power consumption. Over the past few months, multiple forecasts and data points have revealed soaring data center electricity demand. The rise of generative AI and surging GPU shipments are causing data centers to scale from tens of thousands to 100,000-plus accelerators, making power a mission-critical problem to solve.

Demand is particularly challenging in the US because of the high level of investment in data center construction there.

“Arm’s executives also see data center demand rising significantly: CEO Rene Haas said that without improvements in efficiency, "by the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements. Today that’s probably 4% or less." CMO Ami Badani reiterated Haas’ view that data centers could account for 25% of US power consumption by 2030, based on surging demand for AI chatbots and AI training.”

“The data-center builder Schneider Electric’s 2023 white paper The AI Disruption: Challenges and Guidance for Data Center Design estimated that AI workloads accounted for 8% of the estimated 54 GW of electricity used by data centers last year. Schneider expects that share to rise to 15–20% by 2028, when data-center demand is forecast to reach over 90 GW. Of total AI usage, 20% is currently used for the training of models, with the rest going to inference—the individual instances with which the model is tasked. The ratio is expected to evolve to 15:85 by 2028.”
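Translating those percentages into absolute figures is simple arithmetic on the quoted numbers (a sanity check, nothing more):

```python
# AI's share of data center power, per the Schneider Electric figures above.
dc_power_2023_gw = 54                      # quoted data center power draw
ai_share_2023 = 0.08                       # quoted AI share today
dc_power_2028_gw = 90                      # quoted forecast for 2028
ai_share_2028_lo, ai_share_2028_hi = 0.15, 0.20

print(f"AI power today: ~{dc_power_2023_gw * ai_share_2023:.1f} GW")  # ~4.3 GW
print(f"AI power 2028:  ~{dc_power_2028_gw * ai_share_2028_lo:.1f}"
      f"-{dc_power_2028_gw * ai_share_2028_hi:.1f} GW")               # ~13.5-18.0 GW
```

In other words, AI's absolute power draw would roughly triple or quadruple in five years, even though its share of data center power only about doubles.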

And the new generation of GPUs in particular is set to increase demand even further.

“Nvidia’s upcoming Blackwell generation boosts power consumption even further, with the B200 consuming up to 1,200W, and the GB200 (which combines two B200 GPUs and one Grace CPU) expected to consume 2,700W. This represents up to a 300% increase in power consumption across one generation of GPUs, with AI systems increasing power consumption at an even higher rate. SXM allows the GPUs to operate beyond PCIe bus restrictions, offering higher memory bandwidth, higher data throughput and higher speeds for maximal HPC and AI performance, thus drawing more power.”
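To put these per-chip numbers in context at data center scale, here is a rough cluster-level estimate. The 100,000-GPU count echoes the scale mentioned above; the per-GPU wattages are the quoted ones (with ~700 W assumed for the H100 SXM), and the facility overhead factor (a PUE of 1.2) is an illustrative assumption.

```python
# Rough cluster power estimate from the quoted per-GPU figures.
# PUE (power usage effectiveness) of 1.2 is an assumed overhead for
# cooling and facility infrastructure, not a measured value.

def cluster_power_mw(gpu_count: int, watts_per_gpu: float,
                     pue: float = 1.2) -> float:
    """Total facility power in megawatts, including cooling/overhead."""
    return gpu_count * watts_per_gpu * pue / 1e6

for name, watts in [("H100 (~700 W)", 700),
                    ("B200 (~1200 W)", 1200),
                    ("GB200 (~2700 W)", 2700)]:
    print(f"100k x {name}: ~{cluster_power_mw(100_000, watts):.0f} MW")
# -> ~84 MW, ~144 MW, ~324 MW: a single large cluster approaches the
#    output of a mid-sized power plant.
```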


Although this is not the place to start a discussion about CO2 emissions, it should be clear that the question of electricity production is an important one. Globally, efforts are now being made to focus primarily on renewable energies, but global CO2 emissions are still rising – and of course the training of new models has an impact on this.

A paper from the University of Massachusetts Amherst stated that “training a single AI model can emit as much carbon as five cars in their lifetimes.” Yet this analysis covered only a single training run; when a model is improved by training it repeatedly, the energy use is vastly greater.

Consequently, it is not surprising that former Google CEO Eric Schmidt says that the climate targets are unattainable, especially in the context of AGI. 

In addition, requests to an LLM consume significantly more power than a regular Google search. Routers that select the right model (an SLM or an LLM) depending on the question could help reduce demand somewhat, at least temporarily. Nevertheless, the more AI searches replace regular Google searches (e.g. Perplexity.ai), the more electricity demand will rise here as well.
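As an illustration of the routing idea, here is a minimal sketch. The length/keyword heuristic and the model names are hypothetical placeholders; production routers typically use a learned classifier rather than hand-written rules.

```python
# Minimal sketch of an LLM/SLM router (hypothetical heuristic and names).
# Idea: answer cheap, simple queries with a small model and reserve the
# large, power-hungry model for genuinely hard requests.

HARD_HINTS = ("prove", "derive", "step by step", "write code", "analyze")

def route(prompt: str) -> str:
    """Pick a model tier for a prompt. Returns a placeholder model name."""
    looks_hard = len(prompt.split()) > 50 or any(
        hint in prompt.lower() for hint in HARD_HINTS
    )
    # "small-model" / "large-model" are stand-ins for real endpoints.
    return "large-model" if looks_hard else "small-model"

print(route("What is the capital of France?"))   # -> small-model
print(route("Derive the softmax gradient step by step."))  # -> large-model
```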

“Since the introduction over the past year and a half of ChatGPT and other so-called large language models (LLMs), such as Microsoft’s Copilot and Google’s Bard, some researchers have issued—and news outlets have reported—predictions of skyrocketing electricity demand. One such forecaster was Alex de Vries, a data analyst and PhD candidate at the Free University of Amsterdam School of Business and Economics. He says that global electricity demand for AI could grow larger than the entire consumption of Argentina by 2027. “With AI, the whole principle is that bigger is better,” de Vries says. “Bigger models are more robust and perform better. But they require more computational resources and more power.””

But even if you can generate the electricity, it also has to be supplied to the data centers. Often, the power lines are not designed for such a load. And that, in turn, poses major challenges for the power grid.

“Further, lengthy interconnection queues remain a challenge to connecting new projects to the grid, and expediting the permitting/approval process for transmission projects will be key to alleviate it.”

We see that, alongside compute, electricity demand poses the greatest challenge: exponentially increasing demand that collides with climate-protection goals and with an infrastructure (power lines) not designed for such loads. Although attempts are being made on all sides to counter this, by reconnecting old nuclear power plants to the grid and expanding renewable energies, new GPUs and larger models will in the long run demand more power than can be supplied. A solution must therefore be found in the medium term, because the data are clear.

Part 3-3 is coming out tomorrow. Subscribe to the Forward Future Newsletter to have it delivered straight to your inbox.

About the author

Kim Isenberg

Kim studied sociology and law at a university in Germany and has been impressed by technology in general for many years. Since the breakthrough of OpenAI's ChatGPT, Kim has been trying to scientifically examine the influence of artificial intelligence on our society.

Sources: Scale Is All You Need? Part 3-2 (PDF)
