AI May Soon Gobble Up as Much Power as Sweden Each Year

ELECTRIC FEEL

Our bots are already having a huge impact on our environment.

Image: A power plant emitting smoke from its smokestacks. (Ella Ivanescu via Unsplash)

Artificial intelligence companies are power hungry—in more ways than one.

A new commentary published Tuesday in the journal Joule argues that AI systems such as OpenAI’s ChatGPT and Google’s Bard may soon use as much energy as a small country. The growing demand for AI, and the electricity needed to power it, threatens to further exacerbate climate change through added greenhouse gas emissions.

“Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years,” Alex de Vries, founder of digital trends watchdog Digiconomist and a sustainability researcher at Vrije Universiteit Amsterdam, said in a statement.


AI consumes an enormous amount of energy from the very start of its development. Some research has estimated that simply training ChatGPT required 1,287 MWh of electricity, more than 120 U.S. households use in an entire year. AI lab Hugging Face reported that training its own AI chatbot required roughly 433 MWh, the equivalent of a year's use for about 40 U.S. homes.

And remember: that’s just to train these models. Others have estimated that running ChatGPT might consume up to 1,000 MWh each day. Annualized, that’s roughly the electricity consumption of 34,000 U.S. households.
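The household comparisons above can be sanity-checked with simple arithmetic, assuming an average U.S. household uses roughly 10.6 MWh of electricity per year (an approximate round figure; the article itself does not state the per-household baseline):

```python
# Rough sanity check of the article's household comparisons.
# Assumption: an average U.S. household uses ~10.6 MWh of electricity per year.
MWH_PER_HOUSEHOLD_YEAR = 10.6

chatgpt_training_mwh = 1287      # estimated energy to train ChatGPT
hugging_face_training_mwh = 433  # Hugging Face's reported training energy
chatgpt_daily_mwh = 1000         # high-end estimate of ChatGPT's daily usage

print(round(chatgpt_training_mwh / MWH_PER_HOUSEHOLD_YEAR))       # ~121 households' annual use
print(round(hugging_face_training_mwh / MWH_PER_HOUSEHOLD_YEAR))  # ~41 households' annual use
print(round(chatgpt_daily_mwh * 365 / MWH_PER_HOUSEHOLD_YEAR))    # ~34,434 households' annual use
```

All three results land close to the figures quoted in the article, which suggests the comparisons share a similar per-household baseline.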

De Vries notes that energy consumption is likely to worsen as more and more companies incorporate AI models into their products. Google, for example, recently added a host of AI tools to products such as Gmail and Google Sheets, and has begun rolling out an AI-assisted search engine for its nearly 9 billion search queries a day. De Vries estimates that if AI were applied to every one of those searches, Google alone could require 29.2 TWh of power annually—the equivalent of the annual power consumption of Ireland.
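The Google figure can be unpacked into a per-query number. A short calculation, using only the two quantities stated above (29.2 TWh per year and roughly 9 billion queries per day), shows what each AI-assisted search would imply:

```python
# Implied energy per search from the article's figures:
# 29.2 TWh/year spread over ~9 billion queries/day.
queries_per_day = 9e9
annual_twh = 29.2

wh_per_query = annual_twh * 1e12 / (queries_per_day * 365)  # TWh -> Wh
print(round(wh_per_query, 1))  # ~8.9 Wh per AI-assisted query
```

That is many times the energy of a conventional keyword search, which is why attaching AI to a service of Google's scale moves the total into country-sized territory.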

By 2027, he says, worldwide AI power consumption could reach as much as 134 TWh a year, putting these systems on par with the annual electricity use of countries like Argentina, the Netherlands, and Sweden.
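To put 134 TWh in global context, it can be expressed as a share of worldwide electricity use. The calculation below assumes global consumption of roughly 25,000 TWh per year (an approximate round figure not stated in the article):

```python
# 134 TWh of projected AI consumption as a share of global electricity,
# assuming ~25,000 TWh/year of worldwide consumption (approximate figure).
ai_twh_2027 = 134
global_twh = 25_000

share_pct = ai_twh_2027 / global_twh * 100
print(round(share_pct, 2))  # ~0.54% of global electricity
```

Roughly half a percent of the world's electricity for a single class of software is what makes the country comparisons above more than a rhetorical flourish.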

Interestingly, even as companies work to make their AI systems more energy efficient, demand may simply rise to match: greater efficiency invites wider use, so overall consumption can still grow. “The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it,” de Vries said.

This great need for energy is why companies like OpenAI and Microsoft are beginning to explore alternative sources of power such as nuclear energy. Such alternatives will only become more relevant as the AI boom drives power demand upward, growth that climate scientists fear could have a disastrous impact on the environment.

However, de Vries and other experts believe that the most effective way to address this growing need for energy is to be much more judicious about how we use our AI systems.

“The potential growth highlights that we need to be very mindful about what we use AI for,” he said. “It’s energy intensive, so we don’t want to put it in all kinds of things where we don’t actually need it.”
