AI on Track to Gobble Up as Much Energy as a Country, Study Finds

Artificial intelligence may have a larger impact on the environment than previously thought, according to a study published this week in the journal Joule. The research found that AI could undermine efforts to reduce carbon emissions by eventually consuming as much energy as a country the size of Sweden.

That could happen in just a few years, based on how quickly the technology is advancing, according to Alex de Vries, a PhD candidate at the VU Amsterdam School of Business and Economics and author of the new paper. De Vries explains in his study that large language models (LLMs) like ChatGPT are trained on substantial datasets, a process that demands significant computing resources. “If you’re going to be expending a lot of resources and setting up these really large models and trying them for some time, that’s going to be a potential big waste of power,” de Vries told The Verge.

Training AI models uses a lot of energy, but that’s not the only concern. De Vries noted that Google reported 60% of its AI-related energy consumption from 2019 through 2021 stemmed from what’s called the inference phase of production. After AI models are trained, they transition into the inference phase, generating information based on new inputs. While past work has looked at the energy consumed by AI training, de Vries says more needs to be done to account for the full AI life cycle.

Energy production is responsible for more than three-quarters of global greenhouse emissions, according to the International Energy Agency. Pumping these gases into our atmosphere is warming the climate, and humanity has a “rapidly closing window” to reverse course, the most recent Intergovernmental Panel on Climate Change report warned.

“Given the expected production in the coming few years, by 2027 newly manufactured AI devices will be responsible for as much electricity consumption as my home country, the Netherlands,” de Vries told Insider. “This is also in the same range as the electricity consumption of countries like Sweden or Argentina.”

As AI products become widely available and are adopted by more companies, the demand for AI chips is on the rise, with Nvidia reportedly bringing in $US13.5 billion in the second quarter of 2023, the study notes.

“The 141% increase in the company’s data center segment compared to the previous quarter underscores the burgeoning demand for AI products, potentially leading to a significant rise in AI’s energy footprint,” de Vries wrote. “For example, companies such as Alphabet’s Google could substantially increase their power demand if generative AI is integrated into every Google search.”

In August, Google unveiled new AI technology, including its own AI chips designed to train LLMs and AI-powered software for identifying images. The company also added 20 AI models to its cloud service, bringing its total to 100.

“The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland,” de Vries acknowledges in the study, though he says it is more likely that, as AI evolves, the underlying technology will change to support it more efficiently. He argues that, regardless of the outcome, AI developers should be mindful of when and how they use the technology.

“Everyone should be pretty mindful about whether or not they really need to be trying to put AI into their applications,” he told Insider, adding, “It’s not a miracle cure for everything.”
