Samsung and SK hynix to Raise HBM3E Prices by 20%

2025-12-26 11:01:07 · Mr. Ming

Recent reports indicate that Samsung Electronics and SK hynix have raised their projected HBM3E prices for 2026 by nearly 20%, after the U.S. government under the Trump administration cleared NVIDIA's H200 AI accelerator for shipment to China.

Industry sources describe the price increase as unusual. HBM3E remains the dominant high-bandwidth memory product this year, and pricing had been widely expected to soften or stabilize once HBM4 enters mass production next year. However, ongoing shipments of HBM3E-based AI accelerators by major chipmakers such as NVIDIA and AMD are keeping demand on a steady upward trajectory.

A key driver is NVIDIA's renewed access to the Chinese market for its H200 accelerator. Each H200 requires six stacks of HBM3E, pushing memory demand well beyond earlier expectations. According to sources cited by Reuters, NVIDIA plans to fulfill its initial orders using existing inventory, with an estimated shipment of 5,000 to 10,000 eight-GPU servers. This translates to roughly 40,000 to 80,000 H200 chips. At the same time, NVIDIA is preparing to expand production capacity, with additional manufacturing orders expected to be placed in the second quarter of 2026.
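As a back-of-the-envelope check, the sketch below multiplies the cited server counts by GPUs per server and HBM3E stacks per GPU. The chip totals match the article's 40,000 to 80,000 figure; the stack totals are derived here purely for illustration and are not quoted from the report.

```python
# Rough scale of the initial H200 batch, using the figures cited above.
SERVERS_LOW, SERVERS_HIGH = 5_000, 10_000  # estimated eight-GPU server shipments
GPUS_PER_SERVER = 8                        # H200 accelerators per server
STACKS_PER_GPU = 6                         # HBM3E stacks required by each H200

for label, servers in (("low end", SERVERS_LOW), ("high end", SERVERS_HIGH)):
    chips = servers * GPUS_PER_SERVER      # 40,000 / 80,000 H200 chips
    stacks = chips * STACKS_PER_GPU        # 240,000 / 480,000 HBM3E stacks
    print(f"{label}: {servers:,} servers -> {chips:,} chips -> {stacks:,} stacks")
```

On these numbers, the initial batch alone would consume roughly 240,000 to 480,000 HBM3E stacks before any of the additional capacity planned for 2026 comes online.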

Demand for HBM3E is also rising among large cloud and AI platform operators. Google's Tensor Processing Units (TPUs) and Amazon's Trainium accelerators both rely on HBM3E and are scheduled to enter volume shipments in 2026. As AI workloads grow more complex, each new generation of accelerator is expected to increase HBM capacity by around 20–30% compared with earlier designs. Google's seventh-generation TPU is reported to integrate eight HBM3E stacks per chip, while Amazon's Trainium3 is said to use four stacks.

Meanwhile, Samsung Electronics and SK hynix are prioritizing a faster ramp-up of next-generation HBM4, leaving HBM3E supply tight and consistently trailing demand. Analysts estimate that next year's HBM revenue mix will be roughly 45% HBM3E and 55% HBM4, highlighting the rapid transition underway in the AI memory market.
