- Intel and Google signed a multi-year deal to keep Xeon in cloud infrastructure
- Google Cloud instances C4 and N4 already run on Xeon 6 processors
- Intel and Google are co-developing custom IPUs for networking and storage
Intel and Google have announced a multi-year collaboration that will keep Intel Xeon processors at the heart of Google Cloud infrastructure for the foreseeable future.
The agreement spans multiple generations of Xeon chips and includes systems used for AI workloads, inference tasks, and general-purpose computing across Google’s global data centers.
Google Cloud instances such as C4 and N4 already rely on Xeon 6 processors, and this deal ensures that pattern continues.
Why CPUs still matter in an era of specialized AI hardware
“AI is reshaping how infrastructure is built and scaled,” said Lip-Bu Tan, CEO of Intel.
“Scaling AI requires more than accelerators — it requires balanced systems. CPUs and IPUs are central to delivering the performance, efficiency, and flexibility modern AI workloads demand.”
The announcement comes at a time when many hyperscalers are accelerating adoption of custom Arm-based processors for AI tasks.
Counterpoint Research recently claimed that 90% of AI servers running custom silicon will use the Arm instruction set architecture, leaving x86 with only a small share of new deployments.
To ensure Xeon remains relevant, Intel and Google are also jointly developing custom infrastructure processing units designed to handle networking, storage, and security workloads.
These IPUs operate as ASIC-based accelerators that move infrastructure tasks away from host CPUs, freeing Xeon processors to focus on application execution.
This separation improves system efficiency and resource allocation across large cloud deployments running AI tools, AI agents, and large language models.
“CPUs and infrastructure acceleration remain a cornerstone of AI systems — from training orchestration to inference and deployment,” said Amin Vahdat, SVP and Chief Technologist for AI Infrastructure at Google.
Google currently uses both Xeon 5 and Xeon 6 processors across multiple service layers, and these deployments continue alongside its own custom Arm-based Axion processors in other parts of its infrastructure stack.
Intel and Google say the collaboration on CPUs and IPUs will extend across future system generations, with continued integration work throughout Google's cloud infrastructure layers. Both companies maintain that CPUs and infrastructure accelerators remain a standard part of modern distributed cloud design.
Many workloads running in Google’s data centers require backward compatibility with x86 architecture, while others need maximum single-thread performance that Xeon CPUs deliver.
These requirements are expected to persist for years, which explains why Intel and Google signed this multi-year agreement.

