
HBM Demand Soars: SK Hynix and Micron Sold Out!

2024-02-24

NVIDIA has reported exceptional results in its latest Q4 financial statement: revenue surged 265% year on year to $22.1 billion, and net profit rose 769% to $12.285 billion. This performance has propelled NVIDIA's market capitalization past the $2 trillion mark, setting a new company record.

High Bandwidth Memory (HBM), a pivotal component of contemporary AI chips, is seeing sustained high demand. Following Micron's recent announcement that its HBM capacity is fully booked, it has emerged that SK Hynix has likewise sold out its HBM capacity for the current year.

In its 2023 financial report, SK Hynix highlighted substantial revenue growth in DRAM, with DDR5 revenue up fourfold and HBM3 revenue up more than fivefold compared to the previous year. Vice President Kim Ki-tae recently stated that SK Hynix's entire HBM output for 2024 has already been sold out, and the company is now preparing for 2025 to maintain its market leadership. Escalating demand for AI applications in devices such as PCs and smartphones is expected to boost sales of HBM3E, DDR5, LPDDR5T, and related products.

At a December earnings conference, Micron CEO Sanjay Mehrotra disclosed that, owing to the fervor surrounding generative AI, Micron's HBM capacity for 2024 is expected to be entirely sold out. HBM3E, introduced at the beginning of 2024, is expected to generate significant revenue in the current fiscal year.

Earlier indications from SK Hynix pointed to an HBM3E launch in the first half of this year. Recent reports confirm that HBM3E development was completed in mid-January, with mass production scheduled to begin in March and the first batch to be delivered to NVIDIA in April. By comparison, competitors Samsung and Micron have provided HBM3E samples to NVIDIA, but their final product quality certification tests are only set to begin in March.

SK Hynix's HBM3E delivers a 9.6 GT/s data transfer rate over a 1024-bit interface, for a theoretical peak bandwidth of about 1.2 TB/s per memory stack. The upcoming HBM4 moves to a 2048-bit interface and targets a peak bandwidth of over 1.5 TB/s per stack, which the wider interface reaches at a transfer rate of roughly 6 GT/s. That doubled interface width, however, requires more intricate wiring, so HBM4 is expected to cost more than HBM3 and HBM3E. SK Hynix has already begun HBM4 development, and all three major HBM manufacturers plan to start mass production in 2026.
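As a rough sanity check of those figures, peak bandwidth per stack follows directly from the per-pin transfer rate and the interface width. The sketch below simply restates the numbers quoted in this article; they are illustrative values, not official datasheet specifications.

```python
# Theoretical peak bandwidth of an HBM stack:
#   bandwidth (GB/s) = per-pin transfer rate (GT/s) * interface width (bits) / 8 bits per byte
def peak_bandwidth_gbps(rate_gts: float, width_bits: int) -> float:
    return rate_gts * width_bits / 8

# Figures quoted above (illustrative, not official specifications)
stacks = {
    "HBM3E": (9.6, 1024),  # 9.6 GT/s over a 1024-bit interface
    "HBM4":  (6.0, 2048),  # ~6 GT/s over a 2048-bit interface
}

for name, (rate, width) in stacks.items():
    gbps = peak_bandwidth_gbps(rate, width)
    print(f"{name}: {gbps:.1f} GB/s (~{gbps / 1000:.1f} TB/s per stack)")
# HBM3E -> 1228.8 GB/s (~1.2 TB/s); HBM4 -> 1536.0 GB/s (~1.5 TB/s)
```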

Rumors suggest that SK Hynix plans to establish an advanced packaging facility in Indiana, USA, focused on 3D stacking processes for HBM. The facility would reportedly package HBM for integration with NVIDIA's AI GPUs and could also explore stacking memory directly on top of the main processor die.

According to market research firm Gartner, global HBM revenue is projected to more than double, from $2.005 billion in 2023 to $4.976 billion in 2025, a growth rate of 148.2%. AI GPUs currently dominate HBM consumption, with substantial growth in FPGA usage expected after 2025, driven by the deployment of inference models and applications.
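For reference, the quoted growth rate is just the ratio of the two revenue figures; the back-of-the-envelope check below uses the numbers cited above.

```python
# Implied growth in global HBM revenue, 2023 -> 2025 (figures quoted above, in billions of USD)
revenue_2023 = 2.005
revenue_2025 = 4.976

growth_pct = (revenue_2025 - revenue_2023) / revenue_2023 * 100
print(f"2023 -> 2025 growth: {growth_pct:.1f}%")  # ~148.2%
```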

On the supply side, SK Hynix, Samsung, and Micron are the three global HBM suppliers; in the 2022 HBM market, SK Hynix held about 50% share, Samsung 40%, and Micron 10%.

 

Disclaimer: The copyright of this article belongs to the original author. It is reprinted solely to share more information. If the author's information is marked incorrectly, please contact us as soon as possible so we can correct or delete it. Thank you for your attention!