If every search on Google used AI similar to ChatGPT, it might burn through as much electricity annually as the country of Ireland. Why? Adding generative AI to Google Search increases its energy use more than tenfold, according to a new analysis.
The paper, published today in the journal Joule, starts to paint a picture of what AI’s environmental impact might be as the technology permeates seemingly every nook and cranny of pop culture and work life. Generative AI requires powerful servers, and the worry is that all that computing power could make data centers’ energy consumption and carbon footprint balloon.
The new analysis was written by Alex de Vries, a researcher who has called attention to pollution stemming from crypto mining with his website Digiconomist. As he turns his attention to AI, he says it’s still too early to calculate how much planet-heating pollution might be associated with new tools like ChatGPT and similar AI-driven apps. But it’s worth paying attention now, he says, to avoid runaway emissions.
“A key takeaway from the article is this call to action for people to just be mindful about what they’re going to be using AI for,” de Vries tells The Verge. “This is not specific to AI. Even with blockchain, we have a similar phase where everyone just saw blockchain as a miracle cure ... if you’re going to be expending a lot of resources and setting up these really large models and trying them for some time, that’s going to be a potential big waste of power.”
AI already accounted for 10 to 15 percent of Google’s electricity consumption in 2021. And the company’s AI ambitions have grown big time since then. Last week, Google even showed off new AI-powered tools for policymakers to cut down tailpipe emissions and prepare communities for climate change-related disasters like floods and wildfires.
“Certainly, AI is at an inflection point right now. And so you know, predicting the future growth of energy use and emissions from AI compute in our data centers is challenging. But if we look historically at research and also our own experience, it’s that the power needed for AI compute has gone up much more slowly than the demand for that compute,” Google chief sustainability officer Kate Brandt said in a press briefing last week.
“The energy needed to power this technology is increasing at a much slower rate than many forecasts have predicted,” Corina Standiford, a spokesperson for Google, said in an email. “We have used tested practices to reduce the carbon footprint of workloads by large margins, helping reduce the energy of training a model by up to 100x and emissions by up to 1,000x. We plan to continue applying these tested practices and to keep developing new ways to make AI computing more efficient.”
To be sure, de Vries writes that Google Search one day using as much electricity as Ireland thanks to energy-hungry AI is an unlikely worst-case scenario. It’s based on the assumption that Google would shell out tens of billions of dollars on 512,821 of Nvidia’s A100 HGX servers, more than Nvidia has the production capacity to deliver, he writes.
The paper includes a somewhat more realistic scenario that calculates the potential energy consumption of the 100,000 AI servers Nvidia is expected to deliver this year. Running at full capacity, those servers might burn through 5.7 to 8.9 TWh of electricity a year. That’s “almost negligible” in comparison to data centers’ historical estimated annual electricity use of 205 TWh, de Vries writes. In an email to The Verge, Nvidia said its products become more energy efficient with each new generation.
Even so, that electricity use could grow sharply if AI’s popularity continues to skyrocket and supply chain constraints loosen, de Vries writes. By 2027, if Nvidia ships 1.5 million AI servers, that would eat up 85.4 to 134.0 TWh of electricity annually. That rivals the energy hunger of Bitcoin today.
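The ranges above follow from simple back-of-envelope arithmetic: servers × power draw per server × hours per year. A minimal sketch, assuming roughly 6.5 kW per A100 HGX server at the low end and 10.2 kW at the high end (values chosen because they reproduce the figures quoted here, not official Nvidia specs):

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_twh(servers: int, kw_per_server: float) -> float:
    """Annual electricity use, in terawatt-hours, for a fleet of
    servers running at full capacity around the clock."""
    return servers * kw_per_server * HOURS_PER_YEAR / 1e9  # kWh -> TWh

for servers in (100_000, 1_500_000):
    low = annual_twh(servers, 6.5)    # assumed low-end draw per server
    high = annual_twh(servers, 10.2)  # assumed high-end draw per server
    print(f"{servers:>9,} servers: {low:.1f} to {high:.1f} TWh/year")
```

Under the same assumed low-end draw, the worst-case scenario's 512,821 servers work out to about 29 TWh a year, which is how the comparison to Ireland's electricity consumption arises.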