A coalition of environmental, tech, and anti-hate speech groups sent a letter to Senator Chuck Schumer (D-NY) today demanding that the Democratic leader craft policy to address the growing impact AI could have on climate change.
Companies should be required to disclose the environmental impact of developing energy-intensive AI models, the letter says. And legislation aimed at curtailing the misuse of AI should include measures to prevent disinformation about climate change from spreading with the help of AI, the coalition writes.
The letter was signed by Amazon Employees for Climate Justice, Greenpeace USA, the Union of Concerned Scientists, the Center for Countering Digital Hate, and more than a dozen other groups.
Large language models, like the kind behind ChatGPT, are at the heart of the groups’ AI concerns. “The energy use of LLMs must be monitored and disclosed transparently, allowing both consumers and policymakers to understand the trade-off of such technology,” the letter says. “Second, the ease and speed with which people and organizations can use LLMs to produce and distribute climate disinformation threatens to perpetuate climate denialism and slow efforts to fight climate change.”
It takes a lot of energy to train large language models, which can drive up planet-heating carbon dioxide emissions. The amount of compute needed to train large AI models grew a whopping 300,000-fold between 2012 and 2018, according to one analysis, and that was before today’s generative AI boom. AI accounted for nearly 15 percent of Google’s energy use over three years, roughly as much electricity as all the homes in Atlanta use in a year, according to a 2022 analysis.
Companies should have to publicly report the energy consumption and greenhouse gas emissions stemming from the entire life cycle of their AI models, the letter says. That should include the environmental impact of users’ search queries, on top of the training needed to develop and update the models. It’s also important to take stock of what kinds of metals and critical minerals are used for AI and assess whether that affects the availability of those key resources for clean energy, the coalition argues. Silicon is used in computer chips and solar cells, for example.
Worry is also growing that AI tools could supercharge disinformation campaigns. A study published earlier this year found that people were more likely to trust tweets generated by the language model GPT-3 than science content written by humans, including content about climate change.
The letter goes so far as to say that companies and executives ought to be “held liable for harms that occur as a result of generative AI, including harms to the environment, while preserving free expression and human rights.” It also says companies should be able to explain to regulators and the general public how their generative AI models create content and how they measure accuracy.
Schumer has urged his colleagues in Congress to craft rules for regulating AI. That includes convening what he calls “AI Insight Forums” this year. The first one will be on Wednesday, a closed-door meeting between senators and Big Tech leaders including Microsoft co-founder Bill Gates, Meta CEO Mark Zuckerberg, and Sam Altman, CEO of OpenAI, the company that developed ChatGPT. Congress also plans to hold a slew of hearings on AI oversight throughout the week.
AI isn’t the only industry facing scrutiny over its carbon footprint. Democrats have also sought to force energy-hungry Bitcoin mining companies to disclose their electricity consumption and pollution. The SEC has proposed requiring publicly traded companies to report the climate impact of their operations and supply chains. California is also expected to vote this week on a bill that would similarly mandate greenhouse gas emissions disclosures from big companies doing business in the state.