preCharge News TECH — Nvidia CEO Jensen Huang made a series of major announcements at Computex 2025 in Taiwan, reinforcing the company’s position at the center of artificial intelligence development. The highlight was NVLink Fusion, a new technology that allows customers to connect Nvidia’s graphics processing units (GPUs) with non-Nvidia central processing units (CPUs) and application-specific integrated circuits (ASICs).
Breaking Down NVLink Fusion
From Closed to Open Integration
Until now, NVLink was a proprietary technology, restricted to Nvidia’s own chips. The NVLink Fusion platform changes that by allowing external chipmakers to integrate their CPUs with Nvidia’s GPUs. This shift could reshape the AI hardware landscape, opening Nvidia’s ecosystem to a wider range of technologies.
“NVLink Fusion is so that you can build semi-custom AI infrastructure, not just semi-custom chips,” Huang said during his keynote. This approach allows for tighter integration between CPUs and GPUs, making it easier for companies to build custom AI systems. Nvidia’s partners in this initiative include MediaTek, Marvell, Alchip, Astera Labs, Synopsys, and Cadence.
Why NVLink Fusion Matters
Expanding Nvidia’s Data Center Reach
The NVLink Fusion platform aims to capture a share of the AI data center market currently dominated by specialized ASICs. Many competitors, including Google, Microsoft, and Amazon, have developed their own custom processors for specialized AI workloads, challenging Nvidia’s dominance.
Industry analyst Ray Wang told preCharge News that NVLink Fusion “consolidates Nvidia as the center of next-generation AI factories—even when those systems aren’t built entirely with Nvidia chips.” This move is strategic, as it positions Nvidia to serve customers that may not build fully Nvidia-based systems but still want to integrate its powerful GPUs.
Potential Risks and Market Impact
Balancing Flexibility and Market Share
While NVLink Fusion broadens Nvidia’s market reach, it also introduces potential risks. By allowing customers to integrate third-party CPUs, Nvidia risks cannibalizing its own Grace CPU line. However, Rolf Bulk, an equity research analyst at New Street Research, believes the platform’s flexibility makes Nvidia’s overall ecosystem more competitive, helping it fend off emerging rivals.
“At the system level, the added flexibility improves the competitiveness of Nvidia’s GPU-based solutions versus alternative emerging architectures, helping Nvidia to maintain its position at the center of AI computing,” Bulk said.
Notably, Nvidia’s competitors Broadcom, AMD, and Intel are not part of the NVLink Fusion ecosystem, potentially limiting the platform’s reach.
Other Major Announcements
New Hardware, Cloud Solutions, and Taiwan Expansion
In addition to NVLink Fusion, Huang announced the next generation of Grace Blackwell systems for AI workloads. The upcoming GB300, set to launch in the third quarter of 2025, promises higher system performance, reinforcing Nvidia’s leadership in AI hardware.
Nvidia also introduced the DGX Cloud Lepton, a new AI platform designed to connect developers with tens of thousands of GPUs through a global network of cloud providers. This platform aims to streamline access to powerful AI compute resources, addressing a critical bottleneck in the industry.
The company also announced plans for a new office in Taiwan, where it will partner with Foxconn to build an AI supercomputer. “We are delighted to partner with Foxconn and Taiwan to help build Taiwan’s AI infrastructure, and to support TSMC and other leading companies to advance innovation in the age of AI and robotics,” Huang said.
____
Associated Press, CNBC News, Fox News, and preCharge News contributed to this report.