It might seem like magic. Type a request into ChatGPT, click a button and — presto! — here’s a five-paragraph analysis of Shakespeare’s Hamlet and, as an added bonus, it’s written in iambic pentameter. Or tell DALL-E about the chimeric animal from your dream, and out comes an image of a gecko-wolf-starfish hybrid. If you’re feeling down, call up the digital “ghost” of your deceased grandmother and receive some comfort (SN: 6/15/24, p. 10).
Despite how it may appear, none of this materializes out of thin air. Every interaction with a chatbot or other generative AI system funnels through wires and cables to a data center — a warehouse full of server stacks that pass these prompts through the billions (and potentially trillions) of parameters that dictate how a generative model responds.
Processing and answering prompts eats up electricity, as does the supporting infrastructure like fans and air conditioning that cool the whirring servers. In addition to big utility bills, the result is a lot of climate-warming carbon emissions. Electricity generation and server cooling also suck up tons of water, which is used in fossil fuel and nuclear energy production, and for evaporative or liquid heat dissipation systems.
This year, as the popularity of generative AI continued to surge, environmentalists sounded the alarm about this resource-hungry technology. The debate over how to weigh the costs against the less tangible benefits that generative AI brings, such as increased productivity and information access, is steeped in ideological divisions over the purpose and value of technology.
Advocates argue this latest revolution in AI is a societal good, even a necessity, bringing us closer than ever to artificial general intelligence: hypercapable computer systems that proponents say could be as paradigm-shifting as the printing press or the internet.
Generative AI “is an accelerator for anything you want to do,” says Rick Stevens, an associate lab director at Argonne National Laboratory and a computer scientist at the University of Chicago. In his view, the tech has already enabled major productivity gains for businesses and researchers.
One analysis found 40 percent performance gains when skilled workers used AI tools, he notes, and AI assistants can boost vocabulary learning in schools. They can also help physicians diagnose and treat patients and improve access to medical information, says Charlotte Blease, an interdisciplinary researcher at Uppsala University in Sweden who studies health data. Generative AI might even help city planners cut down on traffic (and reduce carbon emissions in the process), or help government agencies better forecast the weather, says Priya Donti, an electrical engineer and computer scientist at MIT and cofounder of the nonprofit Climate Change AI. The list goes on.
Now, at this critical juncture, experts from fields as varied as economics, computer engineering and sustainability are working to assess the true burden of the technology.
How much energy does AI consume?
ChatGPT and other generative tools are power hungry, says Alex de Vries, founder of the research and consulting agency Digiconomist and a Ph.D. candidate at Vrije Universiteit Amsterdam. “The larger you make these models — the more parameters, the more data — the better they perform. But of course, bigger also requires more computational resources to train and run them, requiring more power,” says de Vries, who studies the environmental impact of technologies like cryptocurrency and AI. “Bigger is better works for generative AI, but it doesn’t work for the environment.”
Training generative AI models to spit out an analysis of Shakespeare or the image of a fantastical animal is costly. The process involves developing an AI architecture, amassing and storing reams of digital data and then having the AI system ingest and incorporate that data — which can amount to everything publicly available on the internet — into its decision-making processes. Honing models to be more humanlike and avoid unsafe responses takes additional effort (SN: 1/27/24, p. 18).
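One way to see why bigger models cost more: a common rule of thumb in the scaling-laws literature approximates training compute as roughly 6 × parameters × training tokens, in floating-point operations. The short Python sketch below applies that rule to hypothetical model sizes; none of the figures are any company's real numbers.

```python
# Why "bigger" gets expensive fast: a rule of thumb from the scaling-laws
# literature puts training compute at roughly 6 x parameters x tokens.
# The model sizes and token counts below are illustrative assumptions.

def training_flops(parameters: float, tokens: float) -> float:
    """Approximate training compute via the ~6*N*D rule of thumb."""
    return 6 * parameters * tokens

for params, tokens in [(1e9, 2e10), (7e9, 1e12), (7e10, 2e12)]:
    print(f"{params:.0e} params, {tokens:.0e} tokens -> "
          f"~{training_flops(params, tokens):.1e} FLOPs")
```

Under these assumptions, compute grows with the product of model and dataset size, so scaling both at once multiplies the bill.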
All told, training a single model uses more energy than 100 U.S. homes consume in a year. Querying ChatGPT uses about 10 times as much energy as a standard online search, according to the International Energy Agency. And composing a single email with an AI chatbot can take seven times as much energy as fully charging an iPhone 16, some researchers estimate.
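To make those comparisons concrete, here's a back-of-envelope conversion into common units. The baseline figures below (energy per web search, per phone charge and per household-year) are illustrative assumptions, not measured values.

```python
# Back-of-envelope conversion of the article's comparisons into watt-hours.
# All baselines are assumptions for illustration, not measurements:
SEARCH_WH = 0.3             # assumed energy per standard online search
PHONE_CHARGE_WH = 13.0      # assumed energy for one full phone charge
HOME_KWH_PER_YEAR = 10_500  # assumed annual use of one U.S. home

chatgpt_query_wh = 10 * SEARCH_WH       # "10 times a standard search" (IEA)
ai_email_wh = 7 * PHONE_CHARGE_WH       # "seven full phone charges" per email
training_kwh = 100 * HOME_KWH_PER_YEAR  # "more than 100 U.S. homes in a year"

print(f"One ChatGPT query:    ~{chatgpt_query_wh:.1f} Wh")
print(f"One AI-drafted email: ~{ai_email_wh:.0f} Wh")
print(f"Training one model:   ~{training_kwh / 1e6:.1f} GWh")
```

Even with generous rounding, a single training run lands in the gigawatt-hour range under these assumptions, while each query costs only a few watt-hours.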
Though training is clearly a big resource suck, everyday use matters too: when millions of people rely on chatbots for routine tasks, those small per-query costs add up, says Shaolei Ren, an electrical and computer engineer at the University of California, Riverside. So much so that the AI sector could soon draw as much energy annually as the Netherlands, de Vries estimated in 2023 in Joule. Given generative AI’s rapid growth, the technology’s trajectory has already surpassed that prediction.
And that’s just electricity. Every 10 to 50 ChatGPT queries consume about half a liter of water, per a 2023 analysis by Ren and colleagues. Even that turned out to be a big underestimate, he says, off by a factor of four.
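Scaled up, those drops become torrents. A rough calculation, using the midpoint of the study's query range, Ren's factor-of-four correction and a hypothetical user base, gives a sense of the volumes involved.

```python
# Rough scaling of the per-query water estimate to population level.
# The user count and queries per day are hypothetical, for illustration.

LITERS_PER_BATCH = 0.5  # 2023 estimate: 0.5 L per 10-50 queries
QUERIES_PER_BATCH = 30  # midpoint of the 10-50 range
CORRECTION = 4          # the estimate was low by about a factor of four

liters_per_query = LITERS_PER_BATCH / QUERIES_PER_BATCH * CORRECTION
daily_liters = liters_per_query * 10_000_000 * 10  # 10M users, 10 queries/day

print(f"Water per query:         ~{liters_per_query * 1000:.0f} mL")
print(f"Hypothetical daily total: ~{daily_liters / 1000:.0f} cubic meters")
```

Under these assumptions, a modest user base drains thousands of cubic meters of water a day.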
Some engineers and AI experts dispute these numbers. “I don’t understand what the science is behind these [estimates],” says David Patterson, an engineer at Google and professor emeritus at the University of California, Berkeley. “The only way I can imagine getting an [accurate] answer would be with close cooperation with a company like Google.”
Right now, that’s impossible. Tech companies release limited information about their data centers and AI models, say de Vries and Ren, so it’s hard to precisely assess AI’s cradle-to-grave costs or predict where they’re headed. In their estimates, both researchers relied on proxies, such as AI server production numbers from the tech company Nvidia, or combined knowledge of data center locations with information from corporate sustainability reports.
Real-world trends, however, do point to AI’s voracious energy appetite. For decades before the generative AI boom, efficiency gains compensated for the growing energy demand that came with expanding data centers and computing, says Andrew Chien, a computer scientist at the University of Chicago. That has changed. By the end of 2020, data center expansion began to outpace efficiency improvements, he says. Both Google’s and Microsoft’s self-reported energy usage more than doubled between 2019 and 2023. ChatGPT’s release at the end of 2022 kick-started a generative AI frenzy, exacerbating the issue, Chien says. Before 2022, total energy demand in the United States had been stable for about 15 years. Now it’s rising.
“The easiest way to save energy is to not do anything,” Patterson says. But “progress involves investment and costs.” Generative AI is a very young technology, and stopping now would stymie its potential, he argues. “It’s too early to know that [generative AI] won’t more than compensate the investment.”
A more sustainable path for AI
The choice need not be between shutting down generative AI development entirely and allowing it to continue unrestrained. Most experts say there’s a more responsible middle path: mitigating the technology’s risks while maximizing its rewards.
Policies requiring companies to disclose where and how they’re using generative AI, as well as the corresponding energy consumption, would be a step in the right direction, says Lynn Kaack, a computer science and public policy expert at the Hertie School in Berlin. Regulating uses of the technology and access to it may prove difficult, but Kaack says that’s key to minimizing environmental and social harm.
Perhaps not everyone, for instance, should be able to freely produce voice clones and photorealistic images with a single click. Should we pour the same amount of resources into supporting a generative meme machine as we do for running a hurricane forecasting model?
More research into the tech’s limitations could also save lots of futile consumption. AI “is very powerful in certain kinds of applications, but completely useless in others,” Kaack says.
Meanwhile, data centers and AI developers could take steps to lessen their carbon emissions and resource use, Chien says. Simple changes like training models only when there’s ample carbon-free power on the grid (say, on sunny days when solar panels produce an excess of energy) or subtly reducing system performance at times of peak energy demand might make a measurable difference. Replacing water-intensive evaporative cooling with liquid-immersion cooling or other closed-loop strategies that allow for water recycling would also minimize demand.
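In code, the "train when the grid is clean" idea boils down to a carbon-aware scheduler. Here's a minimal Python sketch of that logic; the simulated carbon feed, the 200-gram threshold and the polling interval are all illustrative stand-ins, since a real deployment would query a grid operator's data.

```python
import random
import time

CARBON_THRESHOLD = 200.0  # assumed cutoff, in grams of CO2 per kWh

def get_carbon_intensity() -> float:
    """Stand-in for a real grid-carbon data feed; returns simulated gCO2/kWh."""
    return random.uniform(100, 500)

def run_when_grid_is_clean(job, poll_seconds: float = 900):
    """Defer a flexible job (e.g., model training) until power is cleaner."""
    while (intensity := get_carbon_intensity()) > CARBON_THRESHOLD:
        print(f"Grid at {intensity:.0f} gCO2/kWh; deferring job")
        time.sleep(poll_seconds)  # wait, say, for a midday solar surplus
    print("Grid is clean enough; starting job")
    job()

# Demo with a fast polling interval and a trivial stand-in workload.
run_when_grid_is_clean(lambda: print("training..."), poll_seconds=1)
```

The same pattern runs in reverse for peak demand: when the grid is strained, throttle the job rather than launch it.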
Each of these choices involves trade-offs; more carbon-efficient systems generally use more water, Ren says, so there is no one-size-fits-all solution. But the alternative to exploring and incentivizing these options — even if they make it marginally harder for companies to develop ever-bigger AI models — is to gamble with our collective environmental fate, he says.
“There’s no reason to believe that technology is going to save us,” Chien says — so why not hedge our bets?
Source: www.sciencenews.org