‘It's going to get a lot worse before it's better’: Yorkshire tech leaders warn on environmental costs of AI
Microsoft’s plan aims to mitigate the increase in CO2 emissions which the rise of AI is set to bring, by powering the energy-hungry data centres AI requires through less carbon-intensive means. But the plan will go only a small way towards satisfying the huge demand for energy that the rise of AI will generate.
A report from Goldman Sachs into the impact of AI on energy consumption estimates that data centre power demand will grow by 160 per cent by 2030, with associated CO2 emissions potentially more than doubling between 2022 and 2030. These figures, however, are just one of many warning signs of the potential environmental impact of AI.
A number of Yorkshire’s tech leaders have now cautioned about these impacts, and called for greater awareness of them among end users.
“I think more people are thinking about sustainability now than a few years ago, but AI is very energy intensive,” says Stuart Clarke, festival director at Leeds Digital Festival.
“You look at all the stats around sending one query on ChatGPT - the technology uses a colossal amount of energy, but we don’t see the big chimneys, so it’s almost as though it's a hidden energy cost.”
Researchers generally agree that the average query put through ChatGPT – the flagship AI model of industry leader OpenAI – requires around 10 times more power than the average Google search.
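As a rough illustration of what that ratio implies at scale, the sketch below uses commonly cited per-query estimates of around 0.3 watt-hours for a Google search and roughly ten times that for a ChatGPT query; the per-query figures and the daily query volume are assumptions for illustration, not numbers from the Goldman Sachs report.

    # Back-of-the-envelope comparison of per-query energy use (Python).
    # Per-query figures are commonly cited estimates, not from the report above.
    GOOGLE_SEARCH_WH = 0.3                    # assumed energy per search, in watt-hours
    CHATGPT_QUERY_WH = GOOGLE_SEARCH_WH * 10  # the "around 10 times" ratio cited above

    queries_per_day = 10_000_000              # hypothetical daily query volume
    extra_wh = (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH) * queries_per_day
    print(f"Extra energy versus plain search: {extra_wh / 1_000_000:.0f} MWh per day")

At those assumed figures, ten million AI queries a day would draw roughly 27 megawatt hours more than the same number of conventional searches.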
The training of GPT-3, the third iteration of OpenAI’s model, reportedly required just under 1,300 megawatt hours of electricity, roughly the same amount used annually by 130 homes in the US.
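A quick sanity check of that comparison, sketched in Python below; the average-household figure of roughly 10.5 megawatt hours a year is an assumption based on published US Energy Information Administration estimates rather than a number from the article.

    # Rough sanity check of the "1,300 MWh is about 130 US homes" comparison.
    # Assumes an average US household uses about 10.5 MWh of electricity a year
    # (a commonly cited EIA estimate); this figure is an assumption, not from the article.
    TRAINING_MWH = 1_300
    AVG_US_HOME_MWH_PER_YEAR = 10.5

    homes_equivalent = TRAINING_MWH / AVG_US_HOME_MWH_PER_YEAR
    print(f"Roughly {homes_equivalent:.0f} US homes' worth of annual electricity")
    # Prints roughly 124, consistent with the "130 homes" comparison quoted above.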
According to a paper in the journal Intelligent Computing, the computing power required for AI is also doubling every 100 days, and is projected to increase by more than a million times over the next five years.
OpenAI previously told Bloomberg that it takes its responsibility to stop and reverse climate change “very seriously”, and that it thinks carefully about how best to make use of its computing power.
Microsoft and Google, two of the biggest companies in the AI sector, both also have carbon-neutral pledges.
Along with an uptick in energy consumption, however, an increase in the use of data centres will also mean increased demand for cooling, putting further strain on the environment.
“Looking at things like data centres - it's not just electricity power, it's also the water required to cool them,” says Phillip White, founder and managing director of Leeds-based digital transformation firm Audacia.
“There is disparity globally about where that is needed - if you’re in a warmer climate, you need more water. There’s a very small number of organisations with the capacity to train and run these large models, running on machines in massive data centres that require huge amounts of electricity to run and water to cool.
“This is why we are now seeing big investors in AI moving into the energy space funding development of nuclear power plants.”
“As much of the digital world is driven by connected services, it may not always be easy to know when people are using large, energy-inefficient models; for example, if you have a web platform connected to a third-party service, you don’t have visibility of what is being implemented further through the chain. Who is responsible for the energy usage?
“It could be me for using the software, the supplier, the person who produced the model – a big problem is with the accountability of the emissions from an activity.”
One of the biggest questions around AI, according to Simon Barratt, co-founder of GREEN-BiT – a Yorkshire-based firm dedicated to reducing the carbon impact of software – is how to make people more aware of its environmental consequences.
“The emissions are in a different place. The same way we take landfill and put it on a boat and ship it somewhere else around the world - it just becomes someone else's problem. It doesn’t seem to click with people,” he says.
“People will sort their recycling or switch their light off but they wouldn't not use AI or a similar thing, even though it might be a much greater cost.
“The hidden cost factor right now means people just cover their eyes or ears and knowingly walk into it and just say ‘we have to keep up’.
“It's hard to criticise those people because of the current climate and market, but if we continue to do that there won't be much of a climate left to do it with, so we need to be responsible at some point. Ultimately, it's going to get a lot worse before it's better.”