- Intel and Google signed a multi-year agreement to maintain Xeon in cloud infrastructure
- Google Cloud C4 and N4 instances now running on Xeon 6 processors
- Intel and Google are jointly developing custom IPUs for networking and storage
Intel and Google have announced a multi-year collaboration that will keep Intel Xeon processors at the heart of Google Cloud infrastructure for the foreseeable future.
The deal spans multiple generations of Xeon chips and includes systems used for AI workloads, inference tasks and general-purpose computing in Google’s global data centers.
Google Cloud instances like C4 and N4 already rely on Xeon 6 processors, and this deal ensures that pattern continues.
Why CPUs are still important in an era of specialized AI hardware
“AI is changing the way infrastructure is built and scaled,” said Lip-Bu Tan, CEO of Intel.
“Scaling AI requires more than accelerators, it requires balanced systems. CPUs and IPUs are critical to delivering the performance, efficiency, and flexibility that modern AI workloads demand.”
The announcement comes at a time when many hyperscalers are accelerating the adoption of custom Arm-based processors for AI tasks.
Counterpoint Research recently estimated that 90% of AI servers running custom silicon will rely on the Arm instruction set architecture, leaving x86 with only a small share of new deployments.
To ensure Xeon remains relevant, Intel and Google are also jointly developing custom infrastructure processing units designed to handle networking, storage, and security workloads.
These IPUs function as ASIC-based accelerators that move infrastructure tasks away from the host CPUs, freeing the Xeon processors to focus on application execution.
This separation improves system efficiency and resource allocation in large cloud deployments running large AI tools, AI agents, and language models.
“CPUs and infrastructure acceleration remain the cornerstone of AI systems, from training orchestration to inference and deployment,” said Amin Vahdat, senior vice president and chief technologist of AI infrastructure at Google.
Google currently uses Xeon 5 and Xeon 6 processors across multiple service layers, alongside the custom Arm-based Axion processors it deploys in other parts of its infrastructure.
Intel and Google say the CPU and IPU collaboration will extend into future generations of systems, with continued integration work across cloud infrastructure layers. They argue that CPUs and infrastructure accelerators remain a foundational element of current cloud system design.
Many workloads running in Google data centers require backward compatibility with the x86 architecture, while others need the maximum single-threaded performance offered by Xeon CPUs.
These requirements are expected to persist for years, which explains why Intel and Google signed this multi-year agreement.