AEI

ASIA ELECTRONICS INDUSTRY | YOUR WINDOW TO SMART MANUFACTURING

NVIDIA Expands AI Infrastructure With Marvell Alliance

NVIDIA is expanding its AI ecosystem through a strategic partnership with Marvell Technology, a move aimed at giving enterprise and hyperscale customers greater flexibility in building next‑generation AI infrastructure. The collaboration brings Marvell into NVIDIA’s AI factory and AI‑RAN ecosystem via NVIDIA NVLink Fusion, while also deepening cooperation on advanced networking and silicon photonics technologies.

At the core of the partnership is NVLink Fusion, NVIDIA’s rack‑scale platform designed to support semi‑custom AI infrastructure. Through this integration, Marvell will deliver custom XPUs and NVLink Fusion‑compatible scale‑up networking, while NVIDIA contributes its broader technology stack, including CPUs, networking, interconnects, data processing units, and AI compute platforms.

The result is a heterogeneous AI infrastructure that remains fully compatible with NVIDIA systems, enabling customers to blend custom silicon with NVIDIA GPUs, networking, and storage platforms.

NVIDIA’s Endeavor building in Santa Clara, California, United States (Image Credit: NVIDIA)

Strategic Investment

The collaboration addresses a growing enterprise demand for choice and architectural flexibility as AI workloads scale rapidly. As organizations race to build “AI factories” to support large‑scale inference and token generation, the ability to integrate specialized compute with a proven AI ecosystem has become a strategic priority. NVIDIA positions NVLink Fusion as a way for customers to innovate at the silicon level without sacrificing ecosystem compatibility or supply‑chain scale.

Beyond data centers, NVIDIA and Marvell are also extending their partnership into telecommunications. The companies plan to work together on NVIDIA Aerial AI‑RAN, targeting the transformation of 5G and future 6G networks into AI‑native infrastructure. In parallel, they will collaborate on advanced optical interconnect solutions and silicon photonics, areas seen as critical to scaling AI performance while managing power and latency constraints.

As part of the expanded relationship, NVIDIA has made a US$2 billion investment in Marvell, underscoring the strategic importance of high‑speed connectivity, custom silicon, and optical technologies in the next phase of AI infrastructure. For enterprise and service‑provider leaders, the partnership signals a continued shift toward more open, modular, and scalable AI system design anchored in NVIDIA’s rapidly expanding ecosystem.

“The inference inflection has arrived. Token generation demand is surging, and the world is racing to build AI factories,” said Jensen Huang, founder and CEO of NVIDIA. “Together with Marvell, we are enabling customers to leverage NVIDIA’s AI infrastructure ecosystem and scale to build specialized AI compute.”

Matt Murphy, Chairman and CEO of Marvell, said, “By connecting Marvell’s leadership in high-performance analog, optical DSP, silicon photonics and custom silicon to NVIDIA’s expanding AI ecosystem through NVLink Fusion, we are enabling customers to build scalable, efficient AI infrastructure.”

02 April 2026