
- AI data centers are overwhelming national grids and driving up energy costs
- Enterprises are turning to nuclear options to sustain power-hungry AI workloads
- OpenAI urges the US government to massively expand national power generation capacity
Microsoft CEO Satya Nadella has drawn attention to a less discussed obstacle in the AI race: the shortage not of processors but of power.
Speaking on a podcast alongside OpenAI CEO Sam Altman, Nadella said Microsoft has “a bunch of chips in inventory that I can’t connect.”
“The biggest problem we’re having now is not a compute glut, it’s power; it’s sort of the ability to get the builds done fast enough close to power,” Nadella added. “In fact, that’s my problem today. It’s not a chip supply issue; it’s actually the fact that I don’t have warm shells to plug into.”
Energy limitations reshape the AI landscape
Nadella explained that while the supply of GPUs is currently sufficient, the lack of adequate facilities to power them has become a critical issue.
In this context, he described “warm shells” as empty data center buildings, ready to house hardware but dependent on access to adequate power infrastructure.
The explosive growth of AI tools has exposed a clear vulnerability: demand for computing power has outpaced the industry's ability to build and power new data center sites.
Energy planning has become a major concern across the tech industry, and even a company with Microsoft's resources is struggling to keep up.
To address the problem, some companies, including major cloud providers, are now exploring nuclear energy to sustain their rapid expansion.
Nadella’s comments reflect a broader concern that AI infrastructure is pushing national power grids to the limit.
As data center construction accelerates across the United States, energy-intensive AI workloads have already begun to influence electricity prices for consumers.
OpenAI has even urged the US government to commit to building 100 gigawatts of new power generation capacity annually.
The company argues that energy security is becoming as important as access to semiconductors in the competition with China.
Analysts have noted that Beijing’s head start in building out hydropower and nuclear power could give it an edge in sustaining AI infrastructure at scale.
Altman also hinted at a potential shift toward more capable consumer devices that could one day run advanced models like GPT-5 or GPT-6 locally.
If chip innovation delivers low-power systems capable of running such models on-device, much of the projected demand for cloud-based AI processing could disappear.
This possibility presents a long-term risk for companies that invest heavily in massive data center networks.
Some experts believe such a shift could even accelerate the bursting of what they describe as an AI-driven economic bubble, threatening trillions of dollars in market value if expectations collapse.
Via TomsHardware



