Texas power grid to be tested by demand for data centers, AI, experts say


Arm CEO warns AI’s power appetite could devour 25% of US electricity by 2030

Arm CEO Rene Haas cautions that if AI continues to grow more powerful without corresponding gains in power efficiency, datacenters could end up consuming enormous amounts of electricity.

Haas estimates that AI datacenters currently account for a modest four percent of US power consumption, but he expects that figure to climb to 20 to 25 percent of the US power grid's output by 2030, per a report from the Wall Street Journal. He lays the blame chiefly on popular large language models (LLMs) such as ChatGPT, which he described as “insatiable in terms of their thirst.”
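To put those percentages into absolute terms, here is a minimal back-of-envelope sketch. The roughly 4,000 TWh figure for total annual US electricity consumption is an illustrative assumption, not a number from the article.

```python
# Back-of-envelope sketch of Haas's projection.
# ASSUMPTION: ~4,000 TWh of annual US electricity consumption
# (illustrative figure, not from the article).
US_ANNUAL_TWH = 4000.0

share_today = 0.04        # ~4% attributed to AI datacenters today
share_2030_low = 0.20     # lower bound of the 2030 projection
share_2030_high = 0.25    # upper bound of the 2030 projection

today_twh = US_ANNUAL_TWH * share_today
low_twh = US_ANNUAL_TWH * share_2030_low
high_twh = US_ANNUAL_TWH * share_2030_high

print(f"AI datacenters today : ~{today_twh:.0f} TWh/year")
print(f"Projected 2030 range : ~{low_twh:.0f}-{high_twh:.0f} TWh/year")
print(f"Implied growth factor: {share_2030_low / share_today:.0f}x to "
      f"{share_2030_high / share_today:.0f}x")
```

Under that assumption, the projection implies AI datacenter demand growing from roughly 160 TWh a year today to somewhere in the range of 800 to 1,000 TWh a year by 2030.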

The Arm CEO isn’t alone in making this prediction. The International Energy Agency’s (IEA) Electricity 2024 report [PDF] expects worldwide power consumption by AI datacenters to reach ten times its 2022 level. Part of the problem is that LLMs like ChatGPT require far more power than traditional search engines like Google. The IEA estimates that one ChatGPT request consumes almost ten times as much power as a Google search.
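The per-query comparison behind that “almost ten times” claim can be sketched as follows. The per-request energy figures (about 2.9 Wh for a ChatGPT query and about 0.3 Wh for a Google search) are the numbers commonly cited alongside the IEA report, but treat them as illustrative assumptions here.

```python
# Rough per-query comparison behind the "almost ten times" claim.
# ASSUMPTION: ~2.9 Wh per ChatGPT request, ~0.3 Wh per Google search
# (figures commonly cited alongside the IEA report, used here for illustration).
CHATGPT_WH_PER_QUERY = 2.9
GOOGLE_WH_PER_QUERY = 0.3

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
print(f"One ChatGPT request uses roughly {ratio:.1f}x the energy of a Google search")

# Scale to a large daily query volume to see why the gap matters at grid scale.
QUERIES_PER_DAY = 1_000_000_000  # illustrative volume, not a real traffic figure
chatgpt_gwh_per_day = CHATGPT_WH_PER_QUERY * QUERIES_PER_DAY / 1e9  # Wh -> GWh
google_gwh_per_day = GOOGLE_WH_PER_QUERY * QUERIES_PER_DAY / 1e9    # Wh -> GWh
print(f"At {QUERIES_PER_DAY:,} queries/day: "
      f"{chatgpt_gwh_per_day:.1f} GWh vs {google_gwh_per_day:.1f} GWh")
```

At those assumed per-query costs, a billion daily queries would draw roughly 2.9 GWh served by an LLM versus about 0.3 GWh served by conventional search.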


www.theregister.com/2024/04/09/ai_datacenters_unsustainable/

