Micron has unveiled the latest advancements in its next-generation HBM4 and HBM4E technologies, with plans to begin mass production in 2026. HBM4 is expected to deliver cutting-edge performance and efficiency, marking a significant step forward in enhancing artificial intelligence (AI) computing capabilities. Along with industry leaders like SK Hynix and Samsung, Micron is competing for dominance in the HBM4 market. During the company's latest investor meeting, Micron confirmed that its HBM4 development is on track, while work on HBM4E has already begun.
Leveraging its mature 1β process technology and continued investment, Micron anticipates that its HBM4 will lead in both time-to-market and energy efficiency, offering a performance boost of more than 50% over the previous HBM3E generation. The company plans to move HBM4 into mass production in 2026.
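As a rough illustration of what a greater-than-50% uplift would imply, the sketch below scales an assumed HBM3E per-stack bandwidth baseline by the claimed factor. The 1.2 TB/s baseline is an assumption used for illustration only; it is not a figure from Micron's announcement, and the claimed uplift may not refer solely to bandwidth.

```python
# Illustrative only: what a ">50% performance boost" over HBM3E could mean
# in per-stack bandwidth terms. The HBM3E baseline is an assumed figure,
# not one stated in Micron's announcement.

HBM3E_BASELINE_TBPS = 1.2   # assumed HBM3E per-stack bandwidth (TB/s)
CLAIMED_UPLIFT = 0.50       # "a performance boost of more than 50%"

hbm4_estimate_tbps = HBM3E_BASELINE_TBPS * (1 + CLAIMED_UPLIFT)
print(f"Implied HBM4 per-stack bandwidth: >{hbm4_estimate_tbps:.1f} TB/s")
```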
HBM4E, which will follow shortly after HBM4, marks a strategic shift in Micron's memory business. By having the logic base die manufactured on TSMC's advanced foundry processes, Micron will be able to offer select customers customized logic dies, a capability it expects to boost its financial performance.
One of the more significant aspects of the HBM4 generation is the industry's move toward integrating memory and logic more tightly within a single package. Building the base logic die on a foundry process and placing it directly beneath the DRAM stack brings the chips closer together, improving performance and efficiency while reducing reliance on conventional packaging interconnects. This is why Micron has chosen TSMC as its logic-die partner, mirroring the approach taken by SK Hynix.
Micron also emphasized the importance of HBM4E, making it one of the first companies, alongside SK Hynix, to disclose work on the standard. While full details of the HBM4 product line remain undisclosed, Micron has revealed that HBM4 will stack up to 16 DRAM dies of 32Gb each and feature a 2048-bit interface, double the width of HBM3E's 1024-bit interface.
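A quick back-of-the-envelope calculation shows what those figures add up to. The stack height, die density, and interface width are taken from the numbers cited above; the per-pin data rate is an assumption for illustration, not a figure Micron has confirmed.

```python
# Hypothetical HBM4 per-stack figures based on the numbers cited above
# (16-high stack, 32 Gb dies, 2048-bit interface). The per-pin data rate
# is an assumed value for illustration, not a Micron specification.

DIES_PER_STACK = 16          # 16-high DRAM stack
DIE_DENSITY_GBIT = 32        # 32 Gb per DRAM die
INTERFACE_WIDTH_BITS = 2048  # double HBM3E's 1024-bit interface
ASSUMED_PIN_RATE_GBPS = 8.0  # assumed per-pin data rate in Gb/s

# Total stack capacity: 16 dies x 32 Gb = 512 Gb = 64 GB
capacity_gb = DIES_PER_STACK * DIE_DENSITY_GBIT / 8

# Theoretical peak bandwidth: interface width x per-pin rate, in TB/s
bandwidth_tbps = INTERFACE_WIDTH_BITS * ASSUMED_PIN_RATE_GBPS / 8 / 1000

print(f"Capacity per stack: {capacity_gb:.0f} GB")
print(f"Peak bandwidth:     {bandwidth_tbps:.2f} TB/s at {ASSUMED_PIN_RATE_GBPS} Gb/s per pin")
```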
In terms of applications, HBM4 is expected to be paired with AMD's Instinct MI400 series and NVIDIA's Rubin AI architecture. With demand for HBM memory running at record levels, Micron has also disclosed that its production capacity is largely booked through 2025, signaling an even brighter outlook for the technology.