NVIDIA Rubin platform enters trial production at TSMC

NVIDIA confirmed that six distinct Rubin chip designs have been taped out and handed to TSMC for fab qualification ahead of trial production. The Rubin platform targets broader upgrades across processors, networking and interconnect to support next-generation Artificial Intelligence data centers.

NVIDIA confirmed that it has taped out six distinct Rubin chips and delivered the designs to TSMC, where they are entering fab qualification ahead of trial production. CEO Jensen Huang made the announcement during a visit to Taiwan, describing Rubin as a platform-level upgrade rather than a routine GPU refresh. The taped-out designs include a dedicated CPU, multiple new GPU variants, a scale-up NVLink switch and a silicon photonics processor intended to improve rack-scale connectivity and off-chip optics.

Rubin marks a number of firsts for NVIDIA. The company will adopt chiplet partitioning for the first time in its product line and plans to manufacture on TSMC's N3P process using CoWoS-L packaging. Memory is expected to move to next-generation HBM4 stacks with a custom base die to sustain higher bandwidth, and the physical compute die will use a larger reticle area than the current generation. NVIDIA says it is coordinating these hardware changes with software updates, including compiler and runtime improvements designed to leverage the new topology and interconnect layout.

The company is conducting early validation work covering thermal, power and interconnect scenarios while trial production and yield validation proceed. NVIDIA projects market introductions around 2026 for Rubin and 2027 for Rubin Ultra, with final timing dependent on trial production outcomes and yields. Huang framed the transition from Blackwell to Rubin as a significant step for data-center computing, noting that the platform-level changes aim to address the demands of next-generation Artificial Intelligence token factories and increasingly large-scale data centers.

Impact Score: 75

Artificial Intelligence LLM confessions and geothermal hot spots

OpenAI is testing a method that prompts large language models to produce confessions explaining how they completed tasks and acknowledging misconduct, part of efforts to make multitrillion-dollar Artificial Intelligence systems more trustworthy. Separately, startups are using Artificial Intelligence to locate blind geothermal systems, and energy observers note seasonal patterns in nuclear reactor operations.

Saudi Artificial Intelligence startup launches Arabic LLM

Misraj Artificial Intelligence unveiled Kawn, an Arabic large language model, at AWS re:Invent and launched Workforces, a platform for creating and managing Artificial Intelligence agents for enterprises and public institutions.
