According to recent reports in South Korean media, SK Hynix Vice President Chun-hwan Kim has officially announced that HBM3E memory has entered mass production, with large-scale production of HBM4 planned for 2026.
Vice President Chun-hwan Kim noted, "With the arrival of the AI computing era, generative artificial intelligence (AI) is advancing rapidly, and the market is projected to grow at an annual rate of 35%." The rapid expansion of the generative AI market calls for far more high-performance AI chips, which in turn drives demand for advanced, high-bandwidth memory.
SK Hynix's HBM3E, which runs at 9.6 GT/s over a 1024-bit interface for a theoretical peak bandwidth of 1.2 TB/s per stack, has officially entered mass production. Meanwhile, in response to escalating demands from the artificial intelligence and high-performance computing (HPC) industries, the industry's focus is shifting toward next-generation HBM4 memory with a 2048-bit interface.
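For readers who want to check the arithmetic, the per-stack figure follows directly from the interface width and the per-pin transfer rate. The short sketch below is purely illustrative (not vendor code) and reproduces the 1.2 TB/s number from the figures quoted above.

```python
def peak_bandwidth_gbps(interface_bits: int, transfer_rate_gtps: float) -> float:
    """Theoretical peak bandwidth in GB/s: (bits per transfer * transfers per second) / 8 bits per byte."""
    return interface_bits * transfer_rate_gtps / 8

# HBM3E: 1024-bit interface at 9.6 GT/s per pin
hbm3e = peak_bandwidth_gbps(1024, 9.6)
print(f"HBM3E per stack: {hbm3e:.1f} GB/s (~{hbm3e / 1000:.1f} TB/s)")  # 1228.8 GB/s, i.e. ~1.2 TB/s
```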
Chun-hwan Kim confirmed that SK Hynix plans to begin HBM4 production in 2026, which he expects to be a significant catalyst for growth in the AI market. Citing strong demand across the HBM industry, he stressed the importance of delivering a seamless, innovative solution. With market projections pointing to 40% growth in the HBM sector by 2025, SK Hynix is positioning itself to capitalize on that expansion.
Meanwhile, other leading players such as Micron and Samsung are also gearing up for HBM4 production in 2026. Micron has indicated that HBM4 will use a 2048-bit interface, raising theoretical peak memory bandwidth to over 1.5 TB/s per stack. Reaching that figure requires a per-pin transfer rate of only about 6 GT/s, which helps keep power consumption in check for the next generation of DRAM. However, the more intricate interposer wiring the wider interface demands, or the placement of HBM4 stacks directly atop the processor, may make it more expensive than both HBM3 and HBM3E.
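The same back-of-the-envelope math shows why the doubled interface width lets HBM4 run its pins more slowly while still exceeding HBM3E's bandwidth. The snippet below is an illustrative comparison based on the figures quoted in this article, not a vendor specification.

```python
def required_rate_gtps(target_gbps: float, interface_bits: int) -> float:
    """Per-pin transfer rate (GT/s) needed to reach a target bandwidth (GB/s) on a given interface width."""
    return target_gbps * 8 / interface_bits

target = 1536  # GB/s, i.e. the ~1.5 TB/s per-stack figure cited for HBM4
print(f"2048-bit interface: {required_rate_gtps(target, 2048):.1f} GT/s per pin")  # ~6.0 GT/s
print(f"1024-bit interface: {required_rate_gtps(target, 1024):.1f} GT/s per pin")  # ~12.0 GT/s
```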
During the latest earnings conference call with analysts and investors, Samsung's Executive Vice President of Memory, Jaejune Kim, stated, "HBM4 is currently in the developmental phase, with samples expected in 2025 and mass production slated for 2026. Fueled by the momentum of generative AI, the demand for customized HBM is on the rise. Therefore, our focus extends beyond standard products, with a commitment to developing performance-optimized custom HBMs for individual clients. Detailed specifications are actively being discussed with key stakeholders."