
Samsung to Raise HBM Output 50% for HBM4

2026-01-05 10:41:13 | Mr. Ming

As the global AI race accelerates, Samsung Electronics has reached a key internal decision to sharply expand its high-bandwidth memory (HBM) manufacturing capacity. The company plans to lift monthly HBM output by roughly 50% by the end of 2026, a move aimed squarely at supporting NVIDIA's next-generation HBM4 platform and strengthening its position in future AI chip programs.

Industry sources indicate that Samsung intends to raise HBM capacity to around 250,000 12-inch wafers per month by late 2026, up from the current level of about 170,000 wafers. This represents an increase of approximately 47%. The expansion is expected to be executed through two parallel approaches: converting part of Samsung's existing DRAM production lines to HBM manufacturing, and building new dedicated HBM lines at its Pyeongtaek P4 facility. Equipment vendors anticipate that procurement and installation of key tools could begin as early as January 2026.
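The reported figures can be sanity-checked with a quick back-of-envelope calculation (a sketch using the wafer counts cited above; the variable names are illustrative only):

```python
# Back-of-envelope check of the reported HBM capacity increase.
current_capacity = 170_000  # 12-inch wafers per month (reported current level)
target_capacity = 250_000   # 12-inch wafers per month (reported late-2026 target)

# Relative increase from current to target capacity
increase = (target_capacity - current_capacity) / current_capacity
print(f"{increase:.1%}")  # prints 47.1%
```

This matches the "approximately 47%" figure cited by industry sources, which the headline rounds to "roughly 50%".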

HBM has become an increasingly central part of Samsung's memory strategy as demand from AI accelerators continues to surge. However, the company has remained cautious in its public messaging. A Samsung Electronics spokesperson stated that the company is "reviewing multiple options to address rapidly growing HBM demand," while declining to confirm specific investment details.

Behind this expansion plan lies both urgency and opportunity. In October 2025, NVIDIA officially confirmed that Samsung's HBM4 would be used in its upcoming AI products, with the memory scheduled to debut alongside NVIDIA's next-generation "Rubin" AI chips in the second half of 2026. According to industry feedback, Samsung's HBM4 has delivered strong performance in Rubin test silicon, earning positive evaluations.

At the same time, sustained global investment in AI infrastructure continues to drive unprecedented demand for HBM, making long-term supply readiness a strategic priority. By moving early and scaling aggressively, Samsung aims to secure sufficient HBM4 output and gain an advantage as competition for NVIDIA's future AI platforms intensifies. Industry insiders note that the current round of Samsung's memory investment is heavily focused on HBM4, underscoring the company's long-term bet on advanced AI memory.

* Disclaimer: The copyright of this article belongs to the original author. It is reprinted solely to disseminate information. If the author's information is marked incorrectly, please contact us so we can correct or remove it promptly. Thank you for your attention.