Recent developments in the memory market have been significantly influenced by reduced production from major memory chip manufacturers over the past year. This, coupled with a modest uptick in end-user demand, has resulted in concurrent price increases for DRAM and NAND Flash memory.
Amidst the surge in artificial intelligence (AI) technology, High Bandwidth Memory (HBM) has emerged from the background and now takes center stage, with sustained growth in market demand. Global market research firm TrendForce predicts a 58% increase in HBM demand for 2023, with an additional 30% growth expected in 2024.
HBM offers a range of advantages, including high bandwidth, high capacity, low latency, and low power consumption, making it highly suitable for high-performance computing scenarios, such as those involving advanced AI applications like ChatGPT. Consequently, HBM technology is gaining favor, with major memory manufacturers actively promoting its development.
Since the inception of the first silicon interposer-based HBM product in 2014, the technology has seen multiple iterations, including HBM, HBM2, HBM2e, HBM3, and HBM3e.
Regarding original equipment manufacturer (OEM) strategies, a prior survey by TrendForce revealed that two major South Korean companies, SK hynix and Samsung, have focused on the development of HBM3. Representative products using their HBM3 include NVIDIA's H100/H800 and AMD's MI300 series accelerators. These manufacturers anticipate sampling HBM3e in the first quarter of 2024. In contrast, U.S.-based Micron has opted to skip HBM3 and develop HBM3e directly.
HBM3e is set to feature 24Gb monolithic dies; in an 8-layer (8Hi) configuration, each HBM3e stack will offer a substantial 24GB of capacity. This advancement is expected to be incorporated into NVIDIA's GB100 in 2025. Accordingly, the three major manufacturers plan to release HBM3e samples in the first quarter of 2024, with full-scale production anticipated in the second half of the year.
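The capacity figure above follows from simple arithmetic: a 24Gb (gigabit) die stacked eight high yields 24GB (gigabytes) per stack. A quick sketch of that calculation:

```python
# Per-stack capacity arithmetic for an 8Hi HBM3e stack, as described above.
die_capacity_gbit = 24   # 24 Gb (gigabits) per monolithic die
layers = 8               # 8Hi configuration

total_gbit = die_capacity_gbit * layers   # 192 Gb per stack
total_gbyte = total_gbit / 8              # 8 bits per byte -> 24.0 GB

print(total_gbyte)  # -> 24.0
```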
Additionally, recent reports suggest that leading memory manufacturers are in the planning stages for the next-generation HBM technology, known as HBM4.
Notably, Samsung Electronics' Vice President and Head of the DRAM Product and Technology Team, Hwang Sang-joon, has announced the successful development of 9.8Gbps HBM3e and plans to offer samples to customers. Concurrently, Samsung is actively working on HBM4, with the objective of supplying it by 2025. Samsung Electronics is reported to be developing non-conductive film (NCF) assembly techniques optimized for high-temperature characteristics, as well as hybrid copper bonding (HCB) technology, for application in HBM4 products.
Reports from South Korean media in September indicated that Samsung intends to significantly revamp its production processes to capitalize on the rapidly expanding HBM market and introduce HBM4 products. Notably, HBM4 stacks will feature a 2048-bit memory interface, doubling the 1024-bit interface used in all previous HBM generations and underscoring the transformative potential of HBM4.
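One way to see why the wider interface matters: peak per-stack bandwidth scales with interface width times per-pin data rate. The sketch below uses the 9.8Gbps HBM3e pin speed mentioned earlier purely as an illustrative rate; HBM4's actual pin speeds have not been announced.

```python
def stack_bandwidth_gbytes(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s: width (bits) * per-pin rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# HBM3e-class stack: 1024-bit interface at 9.8 Gb/s per pin
print(stack_bandwidth_gbytes(1024, 9.8))   # -> 1254.4 GB/s

# Hypothetical HBM4 stack: 2048-bit interface at the same (assumed) pin rate
print(stack_bandwidth_gbytes(2048, 9.8))   # -> 2508.8 GB/s
```

Doubling the bus width doubles bandwidth at a fixed pin rate, which is precisely why the 2048-bit interface is considered a generational leap.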
While HBM4 promises significant advancements, its widespread adoption is not imminent, and it is too early to discuss its practical applications and market penetration. Industry observers suggest that the current HBM market is dominated primarily by HBM2e, with HBM3 and HBM3e poised to take the lead in the future.
In terms of demand for the various HBM generations, TrendForce expects 2023 to mark a shift in primary demand from HBM2e to HBM3, with estimated shares of approximately 50% and 39%, respectively. As the use of HBM3-based accelerator chips continues to grow, market demand is projected to shift decisively toward HBM3 in 2024, surpassing HBM2e with an estimated 60% share. This transition, combined with HBM3's higher average selling price (ASP), is expected to drive substantial HBM revenue growth in the coming year.