Memory crunch won’t hit NVIDIA: Jensen Huang touts early HBM4 exclusivity, strong H200 demand

NVIDIA does not expect memory shortages to affect operations and, as the exclusive user of HBM4 for the time being, the company stands to reap the benefits.

DQI Bureau

At CES 2026, NVIDIA officially unveiled its Rubin platform with next-generation HBM, drawing attention to CEO Jensen Huang’s take on the memory crunch. According to Chosun Biz and Hankyung, he said NVIDIA does not expect memory shortages to affect operations and noted that, as the exclusive user of HBM4 for the time being, the company stands to reap the benefits.

Huang, as per Hankyung, highlighted that NVIDIA is now one of the world’s largest memory buyers and will be the sole client for HBM4 for quite some time. Chosun Biz, citing Huang, notes that NVIDIA is collaborating with every memory partner, and as the initial and sole customer for HBM4, Team Green will be the beneficiary for now.

As SeDaily reported in December, both Samsung Electronics and SK hynix have begun delivering paid final HBM4 samples to NVIDIA, with the industry expecting volumes and pricing to be locked in during the first quarter of 2026.

Hankyung, citing Huang, noted that NVIDIA buys not only HBM, but also graphics DRAM (GDDR) and low-power DRAM (LPDDR), sourcing them from the industry’s three major memory suppliers — Samsung Electronics, SK hynix, and Micron. The report adds that Samsung is recognized as its largest GDDR supplier.

While Huang dismissed concerns that the memory crunch would affect the AI chip giant, he did flag broader supply challenges. Bloomberg reports that he described storage as “a completely unserved market today,” and his CES remarks suggest demand for NAND storage will stay robust across NVIDIA’s systems.

Investing.com reports that Huang highlighted how AI-driven storage demand is already exceeding the capacity of existing infrastructure. “The amount of context memory, the amount of token memory that we process, KV cache we process, is now just way too high. You’re not going to keep up with the old storage system,” he noted.
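To give a rough sense of the “context memory” Huang is pointing at, the sketch below estimates the KV-cache footprint of a single long-context inference request. The model parameters used here (80 layers, 8 KV heads, head dimension 128, FP16) are hypothetical illustrations, not figures from NVIDIA or the cited reports.

```python
def kv_cache_bytes(num_layers, num_kv_heads, head_dim, seq_len, bytes_per_elem=2):
    """Rough KV-cache size for one sequence: keys + values stored for every
    layer and KV head, at the given precision (2 bytes = FP16/BF16)."""
    return 2 * num_layers * num_kv_heads * head_dim * seq_len * bytes_per_elem

# Hypothetical 70B-class model serving a 128K-token context window.
footprint = kv_cache_bytes(num_layers=80, num_kv_heads=8, head_dim=128,
                           seq_len=128 * 1024)
print(f"{footprint / 2**30:.1f} GiB per sequence")  # ~40 GiB for one request
```

Under these assumptions a single 128K-token request holds roughly 40 GiB of KV cache, and a server handling many concurrent long-context sessions quickly spills past GPU memory into fast external storage, which is the demand pressure the quote describes.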

Robust H200 demand in China
Huang, according to CNBC, said that customer interest in China for the H200 AI chip is running at “very high” levels, after U.S. authorities recently signaled they would allow the product to be exported.

The report, citing Huang, suggests NVIDIA has resumed production of the chips and is finalizing the remaining export licensing details with the U.S. government. He also noted that the company has reactivated its supply chain, and H200 chips are now moving through the pipeline.

However, Reuters notes that Huang emphasized Chinese clearance for the H200 will be reflected in incoming purchase orders rather than through any formal announcement.

As Reuters previously reported, NVIDIA has turned to TSMC to boost production, with work expected to ramp up in the second quarter of 2026. The report added that Chinese companies have placed orders for over 2 million H200 chips for 2026, while NVIDIA currently has only around 700,000 units in inventory. The H200, based on NVIDIA’s previous-generation Hopper architecture, is reportedly built using TSMC’s 4nm process.

Source: TrendForce, Taiwan.
