On June 10, Micron Technology announced that it has begun delivering 36 GB 12-high HBM4 (High Bandwidth Memory) samples to several leading partners in the AI industry. This milestone reinforces Micron's leadership in memory performance and energy efficiency for next-generation artificial intelligence platforms.
The new HBM4 memory leverages Micron's mature 1β (1-beta) DRAM process, a proven 12-high advanced packaging architecture, and integrated Memory Built-In Self-Test (MBIST) capabilities. Together, these innovations enable seamless integration for organizations developing advanced AI systems.
As generative AI adoption continues to surge, efficient handling of inference workloads has become critical. Micron's HBM4 features a 2048-bit interface and delivers over 2.0 TB/s of bandwidth per stack—offering more than a 60% performance boost over the previous generation. This wide interface supports faster communication and higher throughput, accelerating inference performance for large language models and complex reasoning architectures such as chain-of-thought systems.
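As a back-of-envelope check, the announced 2048-bit interface is consistent with the >2.0 TB/s per-stack figure. The sketch below assumes a per-pin data rate of roughly 8 Gb/s, which is an illustrative assumption; the announcement states only the interface width and the bandwidth figure.

```python
# Rough per-stack bandwidth estimate from interface width and per-pin rate.
# The per-pin data rate is an assumption for illustration, not a figure
# confirmed in the announcement.

INTERFACE_WIDTH_BITS = 2048   # per stack, from the announcement
ASSUMED_PIN_RATE_GBPS = 8.0   # Gb/s per pin -- assumed value

def stack_bandwidth_tbps(width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in TB/s: (width * per-pin rate in Gb/s) / 8 bits per byte / 1000."""
    return width_bits * pin_rate_gbps / 8 / 1000

bw = stack_bandwidth_tbps(INTERFACE_WIDTH_BITS, ASSUMED_PIN_RATE_GBPS)
print(f"Estimated peak bandwidth: {bw:.3f} TB/s per stack")  # -> 2.048 TB/s
```

Under that assumed pin rate, the math works out to about 2.05 TB/s per stack, matching the "over 2.0 TB/s" claim.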
Power efficiency has also seen a significant leap. Compared to Micron's previous-generation HBM3E, HBM4 delivers over 20% improvement in power efficiency—building upon the industry benchmark first established by Micron. This enhancement maximizes data center efficiency by offering high throughput at minimal power consumption.
Generative AI use cases are expanding rapidly, promising transformative impacts across healthcare, finance, transportation, and beyond. HBM4 serves as a key enabler of this innovation, driving faster insights and discoveries across a wide range of industries.
"Micron's HBM4 delivers best-in-class performance, greater bandwidth, and industry-leading power efficiency—underscoring our leadership in memory technology," said Raj Narasimhan, Senior Vice President and General Manager of Micron's Cloud Memory Business Unit. "Building on the success of our HBM3E deployments, we continue to drive innovation through HBM4 and our comprehensive AI memory and storage portfolio. Our HBM4 production milestones are aligned with the readiness of next-gen AI platforms, ensuring seamless integration and volume ramp."
Micron also confirmed plans to scale up HBM4 production capacity by 2026, aligning with broader AI platform rollouts and accelerating time to market for high-performance computing solutions.