- Nvidia controls the processors and networks, forming the backbone of today’s AI factories
- Nvidia could soon control not only chips but also power, models and applications
- Huang frames AI not as software, but as the foundation of modern industry
Nvidia CEO Jensen Huang recently described artificial intelligence through the metaphor of a multi-layered system.
The framework explains how modern AI systems operate as an industrial chain rather than isolated software tools.
The structure consists of five layers: energy, chips, infrastructure, models and applications, which interact with industries and consumers.
How the AI stack works across all layers
“Every successful application draws on every underlying layer, all the way to the power plant that keeps it alive,” Huang wrote, illustrating how intelligence generated in real time depends on physical resources across the computing ecosystem.
Nvidia already dominates the processor layer, supplies networking technologies, and provides computing platforms within large data centers.
The company’s influence on infrastructure includes systems that connect thousands of processors into machines capable of continuously generating intelligence.
These facilities, sometimes described as AI factories, require land, power supply and networking systems to operate at scale.
Huang noted that construction of new chip manufacturing plants, computer assembly facilities and data centers is underway in multiple regions.
“We have invested a few hundred billion dollars,” he wrote. “Trillions of dollars in infrastructure still need to be built.”
The expansion reflects one of the largest industrial developments associated with modern computing.
At the top of the stack are applications that convert computing power into economic value.
Huang cited examples including drug discovery platforms, industrial robotics, legal analysis tools and autonomous vehicles, some of which embed artificial intelligence directly in physical machines.
“A self-driving car is an application of artificial intelligence built into a machine,” he wrote. “A humanoid robot is an application of artificial intelligence embedded in a body.”
These systems are based on models capable of processing language, images, scientific data, and real-world environments, which increases the demand for computing resources in the lower layers of the stack.
The framework also suggests how Nvidia could expand across the layers it describes.
Companies that control a core technology sometimes extend into adjacent layers, much as Amazon did after building AWS.
Nvidia has been actively expanding into large-scale computing infrastructure and networking systems.
The company has also invested in areas such as photonics that affect how data moves between computer systems.
If Nvidia expands further into models, infrastructure, power delivery or applications, the company could operate at most of the layers described in Huang’s framework.
By framing AI as a layered stack, Nvidia is not only explaining the industry but also staking a claim to it. From chips to infrastructure to applications, the company wants a piece of every layer.