Tech Talk | The AI energy challenge
Image: matteoguedia via 123RF.com
The rapid rise of artificial intelligence (AI) is raising questions and concerns about its energy consumption.
Data centres, crypto mining and AI are ultimately different sides of the same coin: all require banks of processors churning away, often on a 24/7 basis, with energy needed for operations, cooling and associated tasks such as communications for the inbound and outbound data flows.
In scale, the traditional data centre sector is the largest from an energy perspective, while the crypto sector has garnered the most criticism and publicity, driving a rapid shift towards more sustainable – but in some locations still controversial – operations.
But AI, with its accelerating growth, remains something of an uncertainty. While already widely used and still growing in business applications, it has yet to take off at the consumer level – and as it is made more accessible, that use is likely to be massive.
Currently, most consumers are likely to be largely unaware of AI directly, although it contributes to many aspects of daily life.
For many, ChatGPT, introduced in November 2022, was probably their first ‘hands-on’ experience.
Subsequently, over the past few months, Microsoft has been piloting its Copilot, which is built on ChatGPT technology.
But now AI is entering the mass market with Samsung’s new S24 range of Galaxy mobiles full of AI-powered features.
With mobiles the ‘always to hand’ device and a selection of applications that, on paper at least, appear compelling – real-time translation into other languages and wallpaper image generation, to name two – AI is likely to rapidly become part of the daily use of these devices and of others that follow suit, not to mention the possibility of Google’s Bard gaining more prominence in the Android OS.
Energy consumption
The IEA’s Electricity 2024 review, released last week, reports that data centres – of which there are more than 8,000 around the world – cryptocurrencies and AI together consumed an estimated 460TWh of electricity globally in 2022, almost 2% of global demand.
Of this, the majority, almost three-quarters, was from traditional data centres, with almost all of the rest from cryptocurrencies.
Within three years, by 2026, that demand could double, potentially exceeding 1,000TWh, the IEA estimates from its modelling.
In particular, AI’s energy demand is projected to grow exponentially to at least ten times its 2023 demand level, which would put it in the range of 70-100TWh.
As an example of how demand could increase, the IEA points to search tools such as Google, which could see a tenfold increase in their electricity demand with the full implementation of AI in the search process.
A typical Google search currently consumes an average of 0.3Wh of electricity, compared with 2.9Wh per ChatGPT request; scaled up to 9 billion searches daily, that difference amounts to an additional electricity requirement of almost 10TWh a year.
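For readers who want to check the back-of-envelope arithmetic, here is a minimal sketch in Python, assuming the figures quoted above (0.3Wh per conventional search, 2.9Wh per AI-assisted request and 9 billion searches a day):

```python
# Back-of-envelope check of the search-versus-AI electricity comparison.
# Figures assumed from the article: 0.3 Wh per conventional search,
# 2.9 Wh per ChatGPT-style request, 9 billion searches per day.
SEARCH_WH = 0.3          # Wh per conventional Google search
AI_REQUEST_WH = 2.9      # Wh per AI-assisted request
SEARCHES_PER_DAY = 9e9   # daily searches
DAYS_PER_YEAR = 365

extra_wh = (AI_REQUEST_WH - SEARCH_WH) * SEARCHES_PER_DAY * DAYS_PER_YEAR
extra_twh = extra_wh / 1e12  # 1 TWh = 10^12 Wh

print(f"Additional demand: ~{extra_twh:.1f} TWh per year")
# ~8.5 TWh per year, i.e. the "almost 10TWh" figure cited above
```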
Energy challenge
One approach to the energy challenge is greater efficiencies in the data centres themselves.
Currently, the servers and cooling systems are each responsible for about 40% of the demand, with the remaining 20% consumed by the power supply system, storage devices and communication equipment.
More efficient cooling currently offers the greatest benefits, but the move towards hyperscale data centres with upwards of 2,000 racks is also achieving energy savings, and in the future quantum computers could potentially replace traditional servers.
Temporarily shifting data centre workloads in time, or to regions with lower carbon intensity, is also considered to have potential.
Arguably the most significant approach – advocated by Sam Altman, CEO of ChatGPT developer OpenAI, in a conversation with Bloomberg at the World Economic Forum meeting – is a breakthrough in more climate-friendly sources of energy such as nuclear.
“The two important currencies of the future are compute/intelligence and energy and I think we still don’t appreciate the energy needs of this technology,” he said, stating that those energy needs will “force us to invest more in the technologies that can deliver this”.
One nuclear option is small modular reactors (SMRs), with their potential for an onsite power supply, but closer to Altman’s heart is fusion: he has invested $375 million in the private US fusion company Helion Energy.
Helion Energy, which has vowed to be first to deliver fusion power, already has a power purchase agreement in place to supply Microsoft from a plant due to be deployed in 2028, and is targeting 2030 to start supplying baseload power to a Nucor steelmaking facility.
With fusion under development for well over half a century and AI playing an important role in its ongoing advancement, there would be a nice sense of ‘circularity’ if AI’s energy demands were, even indirectly, to help finally deliver that breakthrough.
If you are a user of AI – particularly generative AI as it emerges as a distinct sub-genre – let us know how you are applying it in your utility.
Jonathan Spencer Jones
Specialist writer
Smart Energy International
Follow me on LinkedIn