AI and Energy: Will AI Help Reduce Emissions or Increase Demand? Here's What to Know

July 22, 2024 - How much energy does AI use? Ask ChatGPT and this is what it says:

“AI systems vary widely in energy consumption depending on their complexity and usage, but they generally require significant amounts of electricity to process and analyse data efficiently.”

That response required around ten times the electricity of a Google search, by some estimates. With roughly 100 million people using ChatGPT every week, the extra energy demand quickly adds up, and that is just one platform.
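
As a rough illustration of how per-query energy adds up, the back-of-envelope sketch below multiplies an assumed per-search figure by the reported user base. The per-search energy, the ten-times multiplier and the queries-per-user figure are assumptions for illustration, not measurements.

```python
# Back-of-envelope estimate of weekly ChatGPT electricity use.
# All inputs are illustrative assumptions, not measured values.

GOOGLE_SEARCH_WH = 0.3          # assumed energy per Google search (Wh)
CHATGPT_MULTIPLIER = 10         # "around ten times" a Google search
WEEKLY_USERS = 100_000_000      # reported weekly ChatGPT users
QUERIES_PER_USER_PER_WEEK = 10  # hypothetical usage level

wh_per_query = GOOGLE_SEARCH_WH * CHATGPT_MULTIPLIER          # ~3 Wh per query
weekly_wh = wh_per_query * WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK
weekly_mwh = weekly_wh / 1_000_000                            # Wh -> MWh

print(f"Assumed energy per query: {wh_per_query:.1f} Wh")
print(f"Estimated weekly demand: {weekly_mwh:,.0f} MWh")      # ~3,000 MWh per week
```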

Across the industry, the increasing energy demand, primarily from building and running the data centres used to train and operate AI models, is contributing to global greenhouse gas (GHG) emissions.

Microsoft, which has invested in ChatGPT maker OpenAI and has positioned generative AI tools at the heart of its product offering, recently announced its CO2 emissions had risen nearly 30% since 2020 due to data centre expansion. Google’s GHG emissions in 2023 were almost 50% higher than in 2019, largely due to the energy demand tied to data centres.

So while AI tools promise to help the energy transition, they also require significant computing power.

What’s driving AI’s energy demand?

AI’s energy use currently represents only a fraction of the technology sector’s power consumption, which is estimated to account for around 2-3% of total global emissions. This is likely to change as more companies, governments and organizations use AI to drive efficiency and productivity. Data centres are already significant drivers of electricity demand growth in many regions, as the chart below shows.

Data centre electricity consumption around the world.

Image: IEA

As these systems gain traction and further develop, training and running the models will drive an exponential increase in the number of data centres needed globally – and associated energy use. This will put increasing pressure on already strained electrical grids.

Training generative AI, in particular, is extremely energy intensive and consumes much more electricity than traditional data-centre activities. As one AI researcher said, “When you deploy AI models, you have to have them always on. ChatGPT is never off.”

The growing sophistication of large language models, such as the one ChatGPT is built on, illustrates this escalating demand for energy.

Training a model such as Generative Pre-trained Transformer 3 (or GPT-3) is estimated to use just under 1,300 megawatt hours (MWh) of electricity. This is roughly equivalent to the annual power consumption of 130 homes in the US.

Training the more advanced GPT-4, meanwhile, is estimated to have used 50 times more electricity.

Overall, the computational power needed for sustaining AI’s growth is doubling roughly every 100 days.
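
As a quick sanity check, the arithmetic below reproduces these comparisons, assuming an average US household uses roughly 10 MWh of electricity per year (an approximation consistent with the 130-homes figure) and taking the 50-times and 100-day estimates at face value.

```python
# Rough arithmetic behind the training-energy comparisons above.
# The household figure is an assumed approximation (~10 MWh/year per US home).

GPT3_TRAINING_MWH = 1_300        # estimated electricity to train GPT-3
US_HOME_MWH_PER_YEAR = 10        # assumed average annual US household use
GPT4_MULTIPLIER = 50             # GPT-4 estimated at 50x GPT-3

homes_equivalent = GPT3_TRAINING_MWH / US_HOME_MWH_PER_YEAR   # ~130 homes
gpt4_training_mwh = GPT3_TRAINING_MWH * GPT4_MULTIPLIER       # ~65,000 MWh

# Compute doubling every ~100 days implies roughly 2**(365/100) ~= 12.6x per year.
annual_growth = 2 ** (365 / 100)

print(f"GPT-3 training ~= {homes_equivalent:.0f} US homes for a year")
print(f"GPT-4 training estimate: {gpt4_training_mwh:,.0f} MWh")
print(f"Implied annual growth in AI compute: ~{annual_growth:.1f}x")
```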

How can the AI industry improve its energy efficiency?

This leaves society wrestling with some thorny questions. Do the economic and societal benefits of AI outweigh the environmental cost of using it? And more specifically, do the benefits of AI for the energy transition outweigh its increased energy consumption?

Finding the sweet spot between these challenges and opportunities will be key to answering those questions.

Reports predict that AI has the potential to help mitigate 5-10% of global GHG emissions by 2030. So what needs to happen to strike the right balance?

Regulators including the European Parliament are beginning to establish requirements for systems to be designed with the capability of logging their energy consumption. And advances in technology could help address AI’s energy demand, with more advanced hardware and processing power expected to improve the efficiency of AI workloads.
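
To make the idea of energy logging concrete, here is a minimal sketch of how a training or inference job might record its own GPU energy use, assuming a single NVIDIA GPU and the pynvml package. It is purely illustrative and is not tied to any specific regulatory requirement; the one-second sampling interval is an assumption.

```python
# Minimal sketch: log GPU power draw over time to estimate energy used by a job.
# Assumes a single NVIDIA GPU and the pynvml package; figures are approximate.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU

SAMPLE_SECONDS = 1.0
energy_wh = 0.0

try:
    while True:                                  # wrap your actual workload here
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        energy_wh += power_w * SAMPLE_SECONDS / 3600.0             # W*s -> Wh
        time.sleep(SAMPLE_SECONDS)
except KeyboardInterrupt:
    print(f"Approximate GPU energy consumed: {energy_wh:.1f} Wh")
finally:
    pynvml.nvmlShutdown()
```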

Researchers are designing specialized hardware, such as new accelerators; new technologies, such as 3D chips, which offer much-improved performance; and new chip-cooling techniques. Computer chip maker Nvidia claims its new ‘superchip’ can deliver a 30 times performance improvement when running generative AI services, while using 25 times less energy.

Data centres, too, are becoming more efficient. New cooling technologies, and sites that can shift more of their computations to times when power is cheaper, more available and more sustainable, are being explored to push this efficiency further.

Alongside this, reducing overall data usage will be important, including addressing the issue of dark data: data that is generated and stored but never used again. Being more selective about how and where AI is used will also help, for example by using small language models, which are less resource intensive, for specific tasks. Finding a better balance between the performance, cost and carbon footprint of AI workloads will be key.
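
One way to put that selectivity into practice is to route requests to a small model by default and fall back to a large model only when a task needs it. The sketch below is a hedged illustration of that idea; the model names, task labels and routing threshold are hypothetical placeholders.

```python
# Hedged sketch: send simple tasks to a small, cheaper model and reserve the
# large model for complex requests. Model names and the routing rule are
# hypothetical placeholders, not a specific product's API.

SIMPLE_TASKS = {"classify", "extract", "summarise_short"}

def choose_model(task: str, prompt: str) -> str:
    """Pick the least resource-intensive model likely to handle the task."""
    if task in SIMPLE_TASKS and len(prompt) < 2000:
        return "small-language-model"    # lower energy and cost per request
    return "large-language-model"        # only when extra capability is needed

print(choose_model("classify", "Is this email spam or not?"))   # small model
print(choose_model("reasoning", "Draft a multi-step migration plan..."))  # large model
```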

What about AI’s impact on the electrical grid?

AI is not the only factor applying pressure to the grid. The energy needs of growing populations and trends towards electrification are creating increased demand that could lead to slower decarbonization of the grid.

Yet a clean, modern and decarbonized grid will be vital in the broader move to a net-zero emissions economy.

Data centre operators are exploring alternative options for powering their sites, such as nuclear technologies, as well as storage technologies such as hydrogen. Companies are also investing in emerging tech such as carbon removal, which pulls CO2 out of the air and stores it safely.

AI can also play a role in overcoming the barriers to integrating the vast amounts of renewable energy needed into existing grids.

The variability of renewable energy production often results in overproduction during peak times and underproduction during lulls, leading to wasted energy and grid instability. By analyzing vast datasets, from weather patterns to energy consumption trends, AI can forecast energy production with remarkable accuracy.

This could enable job scheduling and load shifting to make sure data centres use energy when electricity from renewable energy sources is available – ensuring optimal grid stability, efficiency and 24/7 clean power.
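
As a simple illustration of such load shifting, the sketch below takes an hourly forecast of grid carbon intensity (which could come from an AI forecasting model or a grid operator) and picks the cleanest contiguous window for a flexible batch job. The forecast values and the four-hour job length are made up for illustration.

```python
# Carbon-aware scheduling sketch: place a flexible batch job in the hours with
# the lowest forecast grid carbon intensity. Forecast values are illustrative.

# Hypothetical 24-hour forecast of carbon intensity (gCO2 per kWh).
forecast = [420, 410, 400, 390, 380, 360, 300, 240, 180, 150, 140, 135,
            130, 140, 160, 200, 260, 330, 390, 420, 440, 450, 445, 430]

JOB_HOURS = 4  # the batch job needs four contiguous hours

def best_window(intensity, hours):
    """Return the start hour and average intensity of the cleanest window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensity) - hours + 1):
        avg = sum(intensity[start:start + hours]) / hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

start, avg = best_window(forecast, JOB_HOURS)
print(f"Run the job from hour {start} to {start + JOB_HOURS}, "
      f"average intensity ~{avg:.0f} gCO2/kWh")
```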

Digital technology including AI could make a significant contribution to helping sectors including energy reach net zero.

Image: Accenture, IEA, OECD, WEF, UN and US

AI is also helping to transform the energy efficiency of other carbon-intensive industries, from modelling buildings to predict energy use and optimize heating and air-conditioning performance, to improving manufacturing efficiency through predictive maintenance. In agriculture, sensors and satellite imagery are helping to predict crop yields and manage resources.

Balancing AI’s energy use and emissions against its societal benefits involves many complex, interlinked challenges and requires a multistakeholder approach.

The World Economic Forum’s Artificial Intelligence Governance Alliance is applying a cross-industry and industry-specific lens to understand how AI can be leveraged to transform sectors and drive impact on innovation, sustainability and growth.

As part of this initiative, the Forum’s Centre for Energy and Materials and Centre for the Fourth Industrial Revolution are launching a dedicated workstream to explore the energy consumption of AI systems and how AI can be leveraged as an enabler for the energy transition.