On April 19, 2026 (local time), media reports citing sources familiar with the matter revealed that Google, a subsidiary of Alphabet, is in talks with Marvell Technology, a U.S.-based fabless semiconductor design company, to develop two new AI chips. The goal is to run artificial intelligence models more efficiently, improve inference performance, and reduce computing costs.
The collaboration centers on two chips. The first is a Memory Processing Unit (MPU), which will work in tandem with Google’s existing Tensor Processing Units (TPUs) to offload memory-intensive tasks, alleviating TPU memory-bandwidth bottlenecks and improving system efficiency in high-concurrency inference scenarios; the two companies plan to complete the design and enter pilot production as early as 2027. The second is a dedicated TPU optimized for inference, aimed at achieving a better cost-performance ratio and reducing the computational cost per AI response. Google’s current flagship AI accelerator is the TPU v7 (codenamed Ironwood), and this collaboration aims to further optimize inference performance.
From a strategic perspective, Google is systematically building a diversified supply chain for custom chips, having previously partnered with companies such as Broadcom and MediaTek. Marvell’s addition expands Google’s design partners from two to three and introduces a new chip category: the MPU. As the user base for AI continues to grow, inference costs have become the fastest-growing component of data center operating expenses. Google hopes to reduce these costs and improve the return on investment for AI through custom chips.
Marvell has extensive experience in the field of custom cloud chips, having provided design services to cloud providers such as Amazon, Microsoft, and Meta. In this partnership, Marvell will assume a design service role, similar to its collaboration model in Google’s Axion ARM CPU project. Recently, Marvell also entered into a strategic partnership with NVIDIA, which involves a $2 billion investment and deep integration of its chips and networking products, further strengthening Marvell’s position in the AI chip market.
Market analysts believe that this move reflects a core shift in Google’s AI chip strategy—a gradual transition from prioritizing training performance to emphasizing inference efficiency, in order to address the challenges posed by rapidly rising inference costs. This collaboration marks a significant step in Google’s diversified strategy within the AI chip sector, aiming to optimize inference performance and reduce operational costs through custom chips, thereby solidifying its competitive edge in the AI infrastructure space.