Wednesday, March 18, 2026

Nvidia’s Vera Rubin: The Future of AI with 22TB/s Bandwidth and HBM4 Memory

Forecasts suggest that Nvidia’s next-generation superchip, Vera Rubin, will integrate sixth-generation High Bandwidth Memory (HBM4) exclusively from SK hynix and Samsung Electronics. With Micron seemingly out of the HBM4 supply chain, the two South Korean semiconductor giants are poised to split the HBM market for Vera Rubin between them.

On February 7, semiconductor analysis firm SemiAnalysis reported, “There is no evidence of Nvidia placing HBM orders with Micron,” projecting Micron’s HBM share in Vera Rubin to plummet to 0%. They forecast SK hynix capturing roughly 70% of the HBM4 supply, with Samsung Electronics securing the remaining 30%.

The Vera Rubin system integrates 36 Vera CPUs and 72 Rubin GPUs into a single rack-scale platform. Nvidia claims a fivefold increase in inference performance over current Blackwell-based products, a 90% reduction in per-token costs, and model training with only a quarter of the GPUs. The system is expected to be released late this summer as the VR200 NVL72 rack-scale solution.
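The headline ratios cited above can be sanity-checked with simple arithmetic. This is an illustrative sketch only: the figures are Nvidia’s own claims, not independent measurements, and the baseline values are arbitrary placeholders.

```python
# Illustrative arithmetic on Nvidia's claimed Vera Rubin vs. Blackwell ratios.
# Baselines are arbitrary placeholders, not real benchmark numbers.

blackwell_inference = 1.0                        # baseline inference throughput
rubin_inference = 5 * blackwell_inference        # claimed fivefold increase

blackwell_cost_per_token = 1.0                   # baseline per-token cost
rubin_cost_per_token = blackwell_cost_per_token * 0.10  # a 90% reduction leaves 10%

blackwell_training_gpus = 1000                   # hypothetical fleet size for comparison
rubin_training_gpus = blackwell_training_gpus // 4  # claimed quarter of the GPUs

print(rubin_inference)       # 5.0
print(rubin_cost_per_token)  # 0.1
print(rubin_training_gpus)   # 250
```

Note that a 90% per-token cost reduction alone implies roughly a 10x improvement in cost-efficiency, independent of the raw throughput claim.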

Engineered for large-scale AI model operations, Vera Rubin hinges on HBM4-based ultra-high-bandwidth memory. Some reports suggest that Micron, unlike its Korean counterparts, failed to secure HBM4 orders for Vera Rubin.

However, industry watchers suggest Micron might offset this setback by supplying LPDDR5X memory, with some speculating that it could provide substantial LPDDR5X volumes for the Vera CPUs within the Vera Rubin ecosystem.

Insiders point to Nvidia’s aggressive memory specification upgrades as a key factor in supplier selection. Nvidia initially targeted a 13TB/s memory bandwidth for the VR200 NVL72 system in March 2025, but raised this to 20.5TB/s by September. At CES 2026, the company revealed the system operating at 22TB/s, a nearly 70% jump from the original goal.
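The escalation described above works out as follows, using the three bandwidth figures cited in this article:

```python
# Quick check of the VR200 NVL72 bandwidth figures cited in the article.
initial_tb_s = 13.0   # target announced March 2025
revised_tb_s = 20.5   # raised by September 2025
ces_tb_s = 22.0       # shown at CES 2026

# Percentage increase of the CES figure over the original target.
jump_pct = (ces_tb_s - initial_tb_s) / initial_tb_s * 100
print(f"{jump_pct:.1f}% increase over the original target")
```

The result is about 69.2%, consistent with the “nearly 70% jump” quoted above.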

This specification escalation severely tests memory manufacturers’ technological prowess and yield management. Currently, only Samsung and SK hynix are deemed capable of meeting the stringent HBM4 requirements.

