AI’s Energy Appetite Could Match Argentina’s Annual Consumption by 2027
In a new commentary, environmental sustainability has emerged as a critical concern for the AI industry, joining the ranks of its existing challenges.
Artificial intelligence systems are remarkably capable, but they consume large amounts of electricity, a growing problem at a time when the world is trying to shift toward cleaner energy. Researchers who have examined the trend warn that AI’s power demand could rise sharply in the coming years.
In a commentary published on October 10th in Joule, Alex de Vries, a PhD candidate in Business and Economics at Vrije Universiteit Amsterdam, suggests that the electricity used by AI worldwide might reach a whopping 134 terawatt-hours every year by 2027.
To put it in perspective, that’s about as much electricity as countries like Argentina, the Netherlands, and Sweden use in a year.
While de Vries notes that data centers’ electricity use (excluding energy-hungry cryptocurrency mining) grew only about 6% from 2010 to 2018, there is growing concern that the computing resources required to develop and run AI models and applications could push data centers’ share of global electricity consumption significantly higher.
With many industries embracing AI over the past year, it is not hard to picture such a surge becoming reality. For instance, if Google incorporated ChatGPT-like technology into its roughly 9 billion daily searches, the company could consume as much as 29.2 terawatt-hours of electricity per year, about as much as Ireland’s annual electricity consumption.
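As a rough sanity check, the per-search energy implied by those figures can be back-calculated. The short sketch below is an illustrative estimate only, assuming a constant 9 billion searches per day over 365 days; it is not de Vries’ own methodology.

```python
# Back-of-the-envelope check: what per-search energy is implied by
# 9 billion daily searches adding up to 29.2 TWh per year?
# Assumptions (illustrative only): constant 9e9 searches/day, 365 days/year.

SEARCHES_PER_DAY = 9e9
ANNUAL_CONSUMPTION_TWH = 29.2

searches_per_year = SEARCHES_PER_DAY * 365      # ~3.3 trillion searches
annual_wh = ANNUAL_CONSUMPTION_TWH * 1e12       # 1 TWh = 1e12 Wh

wh_per_search = annual_wh / searches_per_year
print(f"Implied energy per AI-assisted search: {wh_per_search:.1f} Wh")
# -> roughly 8.9 Wh per search, far above the ~0.3 Wh often cited
#    for a conventional Google search.
```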
De Vries, the founder of Digiconomist, a research company that tracks the unintended consequences of digital trends, believes such an extreme scenario is unlikely in the near term, given the high cost of AI servers and bottlenecks in their supply chain.
Still, as AI becomes more widespread, its energy demands are likely to grow, which calls for careful consideration of when these technologies are worth using.
For instance, NVIDIA is expected to deliver 100,000 AI servers to customers this year. Running at full capacity around the clock, they would draw a combined 650 to 1,020 megawatts, equivalent to 5.7-8.9 terawatt-hours of electricity per year. Compared with overall data center consumption, that is still relatively small.
But by 2027, NVIDIA might be delivering 1.5 million AI servers each year. Using similar power consumption rates, this could amount to 85-134 terawatt-hours of electricity annually.
At that scale, these servers could contribute significantly to global data center electricity consumption, according to de Vries.
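The arithmetic behind these estimates is a straightforward power-to-energy conversion: a fleet’s continuous draw in megawatts, multiplied by the hours in a year, gives its annual consumption. The sketch below reproduces the figures above under the simplifying assumption of full utilization all year, the worst-case framing of the scenario rather than any official NVIDIA or de Vries calculation.

```python
# Convert a server fleet's continuous power draw (MW) into annual energy use (TWh),
# assuming the servers run at full capacity all year (a worst-case simplification).

HOURS_PER_YEAR = 8_760  # 24 hours * 365 days

def annual_twh(power_mw: float) -> float:
    """Annual electricity use in TWh for a constant draw of power_mw megawatts."""
    return power_mw * HOURS_PER_YEAR / 1e6  # 1 TWh = 1,000,000 MWh

# 100,000 servers delivered this year: 650-1,020 MW combined draw
print(f"Current fleet: {annual_twh(650):.1f}-{annual_twh(1_020):.1f} TWh/year")   # ~5.7-8.9 TWh

# 1.5 million servers by 2027: same per-server draw, scaled 15x
print(f"2027 fleet: {annual_twh(650 * 15):.0f}-{annual_twh(1_020 * 15):.0f} TWh/year")  # ~85-134 TWh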
As de Vries points out, AI is not a magic solution for everything. It still faces concerns such as privacy, bias, and hallucinations, and environmental sustainability can now be added to that list.