- AWS AI Factories place Amazon and Nvidia hardware on customer premises
- They are designed to meet strict data sovereignty and privacy requirements
- The return to on-premises infrastructure has gained traction in the AI era
Amazon Web Services has revealed more information about its AI factories: full-stack AI infrastructure that sits within the customer’s own data center.
This means that customers provide the facilities and power, while Amazon's cloud division supplies and manages the AI systems, making an AI Factory function much like a private AWS region.
In addition to giving organizations more control over data sovereignty, security, and regulatory requirements, the offering gives them access to hardware options such as Nvidia's Blackwell GPUs or Amazon's Trainium3 accelerators.
AWS AI Factories are on-premises facilities with shared responsibilities
Why would a customer take on the added burden of providing facilities and power? It's simple: certain companies and governments want access to advanced AI but are limited in the data they can send off-premises.
Building standalone AI infrastructure is slow and expensive, but AWS says it can deploy these systems in months, helping customers avoid large capital burdens.
Since AWS manages the entire AI environment exclusively for a single customer, data remains local and hardware is not shared with other tenants.
The shift to on-premises infrastructure is an interesting step back from the cloud push of recent years, driven largely by concerns about sensitive data, AI training, and national security.
“By combining NVIDIA’s latest Grace Blackwell and Vera Rubin architectures with AWS’s high-performance, secure infrastructure and AI software stack, AWS AI Factories enables organizations to deploy powerful AI capabilities in a fraction of the time and focus entirely on innovation rather than integration,” said Ian Buck, vice president and general manager of Hyperscale and HPC at Nvidia.
But Amazon is not the only one pushing the concept of AI factories. Microsoft relies on Azure Local to support sovereignty requirements, which comprises Microsoft-managed hardware installed within a customer’s premises.