
Intel and Google expand AI chip partnership

Google recently expanded its partnership with Intel by committing to use multiple generations of the chipmaker’s CPUs in its AI data centres. Under this expanded agreement, Intel’s latest Xeon 6 processors will support AI training and inference workloads.

Google’s Amin Vahdat, SVP and Chief Technologist, AI Infrastructure, stated Intel’s Xeon roadmap gives his company confidence it “can continue to meet the growing performance and efficiency demands of our workloads”.

Intel CEO Lip‑Bu Tan noted scaling AI requires more balanced systems rather than accelerators alone. The deal comes as CPUs regain strategic importance in AI systems, with processors becoming a bottleneck as agentic AI workloads extend beyond GPUs.

“The modern data centre needs to have CPUs to do much of the processing that goes on around the accelerators, and that’s increasingly important as we move to AI inference from training”, analyst Jack Gold told Mobile World Live. “While we often focus on GPUs and what Nvidia is doing in that space, it misses looking at the need for CPUs that manage all the workloads and do a significant portion of processing beyond just AI acceleration”.

He explained Google realises it needs powerful CPUs to handle much of the cloud workload it hosts, even as the tech giant looks to use some of its own custom CPUs for less intensive tasks.

Gold stated x86 architecture still rules data centres, which means Intel chips are in demand, as are AMD’s x86 CPUs.

“That’s good news for Intel, who has already stated they are supply constrained in how many CPUs they can supply to the market because of high demand”, he said.

Intel is manufacturing its latest Xeon chips using its advanced 18A process at its fab in the U.S. state of Arizona.

IPUs

The two companies will also deepen co‑development of custom ASIC‑based infrastructure processing units (IPUs). The move comes despite Google’s growing use of in‑house chips, including its TPU AI accelerators.

The IPUs offload networking, storage and security tasks from host CPUs, improving efficiency, utilisation and predictability at hyperscale.

“Cloud system efficiency also needs to manage things like interconnect, storage and power management”, Gold explained. “For best efficiency, you need a custom designed IPU that is tailored to the specific infrastructure designs of the hyperscaler data centre.”

Gold stated IPUs are important for Intel because it is looking to fill its fabs as the manufacturing group aims to be a mass provider of chips to outside customers. He noted the custom IPU will help Intel significantly increase volumes through its fabs and become more profitable.

“Bottom line, this announcement is good news for Google because it gets needed processing components for its cloud business. And it’s also good news for Intel in that it maintains its relations with a major cloud producer while also filling its fabs”.

Financial terms and the length of the expanded collaboration were not disclosed.

Source: Mobile World Live


