Nvidia AI networking is now central to the company’s future as it reclaims the title of the world’s most valuable company. Whether it maintains that top spot will depend largely on how well it builds and controls the infrastructure for AI’s next era. CEO Jensen Huang envisions a future where applications run across a distributed system—part in the cloud, part at the edge, and part in autonomous machines.
In a conversation with Ethernet inventor Bob Metcalfe, Huang explained that the evolution of software should drive hardware design. He described data centers as “composable disaggregated infrastructures,” in which computing nodes constantly exchange information over high-performance networks. That thinking led directly to Nvidia’s $6.9 billion acquisition of Mellanox, announced in 2019.
Founded in Israel in 1999, Mellanox initially developed high-speed, low-latency networking products based on InfiniBand and later expanded into Ethernet. Nvidia saw that combined expertise as strategic, and it now powers Nvidia’s networking division, led by Kevin Deierling, Mellanox’s first U.S. employee.
Deierling helped launch Spectrum-X, Nvidia’s purpose-built Ethernet platform for AI. Unlike traditional cloud workloads, AI requires moving massive synchronized data sets between GPU nodes. “AI creates elephant flows,” Deierling explained, referring to the enormous, bursty traffic patterns that AI tasks generate.
Furthermore, AI’s transition from training to inference is intensifying pressure on networks. Training requires vast data sets and long run times, but inference means running models in real time, often across shared infrastructure. This shift demands both speed and scalability. Spectrum-X addresses these needs by combining the familiarity of Ethernet with the speed of InfiniBand.
Moreover, Ethernet’s widespread use gives Nvidia a significant advantage. Most companies already run Ethernet, so adopting Spectrum-X requires fewer operational changes. “We’ve built on standards customers already know,” Deierling said. “But we’ve optimized everything under the hood for AI.”
The platform’s real-world impact is already clear. Spectrum-X powers the world’s largest AI supercomputer, demonstrating that the technology scales to the most demanding deployments. Additionally, faster, more efficient data movement within the data center increases profitability. “Service providers care about performance per dollar and per watt,” said Deierling. Spectrum-X helps them maximize both metrics.
Another advantage lies in customization. Data centers can adjust performance levels to meet user needs and pricing tiers. This flexibility boosts efficiency and competitiveness. It also supports advanced use cases, like personalizing AI agents with proprietary data.
This personalization reduces errors and enhances compliance. As enterprises build AI agents that understand their domain, they gain accuracy while respecting data privacy laws. As Deierling noted, “It’s a smart way to reduce hallucinations in AI outputs.”
At the same time, Nvidia sees growing demand at the network’s edge. Deierling highlighted the rise of physical AI—robotics, autonomous vehicles, and mobile sensors. These systems require real-time connectivity across a distributed infrastructure. Ethernet provides the thread that ties together cloud, edge, and endpoint computing.
Back in 1984, Sun Microsystems coined the phrase “The Network is the Computer.” Nvidia is proving that the statement holds true for AI. Nvidia’s founders once imagined flight simulators and 3D games as the future of computing. Today, Huang believes AI marks a new paradigm—“a new way of writing software.”
Consequently, Nvidia has built the infrastructure that allows developers to create for this new world. Its success comes from insight, strategy, and timing. “We found ourselves at the right place at the right time,” Huang told Metcalfe. Nvidia’s AI networking bet appears to be paying off—and it could define the next generation of computing.