
AI is growing faster than ever! In just two years, ChatGPT has gained 400 million weekly active users — that’s a lot of curious minds.
But while we all talk about how cool and helpful AI is, there are two big challenges that don't get enough attention: its demand for electricity and water.
You see, AI doesn't just run on "smart." It runs on power, and a lot of it. Behind every chatbot, image generator, or smart system is a data center: a giant building full of powerful servers that consume huge amounts of electricity and use large amounts of clean water to stay cool.
A report from Goldman Sachs projects that by 2030, data centers will use 165% more electricity than they did in 2023!
In fact, the U.S. alone is expected to use more electricity just to process data than it uses to make energy-intensive goods like steel, cement, and chemicals, all combined.
And here's another issue: our power grids are old. In the U.S., the grid is about 40 years old on average; in Europe, it's even older, around 50 years. These aging systems are struggling to keep up with AI's power demands.
On top of all that, we’re also trying to cut down carbon emissions and go green. But AI is making this trickier because it needs so much power, and a lot of that still comes from fossil fuels like coal, gas, and oil. So, we have to move faster toward cleaner sources like solar, wind, hydro, and nuclear power.
Now, it's not all doom and gloom. Future data centers are being designed to be smarter and more efficient. They'll use less electricity and recycle water more effectively. Some new rules even allow companies to build their own power and water systems right next to their data centers, which is a big step in the right direction.
Still, these power and water problems are real, and we need to start talking about them more, because the AI-powered world is already here.
--------------------------------------------
References:
https://time.com/7272558/ai-rising-global-electricity-demand/