SK Hynix Delivers First 12-Layer HBM4 Samples to Clients

2025-03-20 13:42:57 | Mr. Ming

On March 19, 2025, SK hynix, a leading memory chip manufacturer, announced its ultra-high-performance 12-layer HBM4 DRAM for AI applications and said it has delivered samples of the new product to key customers, the first such shipment in the world.

Building on its industry-leading expertise in HBM technology and production, SK hynix emphasized that the 12-layer HBM4 samples are being shipped ahead of schedule, with client validation processes already underway. The company is set to complete mass production preparations in the second half of the year, reinforcing its leadership position in the next-generation AI memory market.

The newly introduced 12-layer HBM4 DRAM delivers the high-speed performance and data-processing capability required for AI memory. It offers bandwidth of more than 2 TB per second, a performance increase of over 60% compared with its predecessor, HBM3E. That bandwidth is equivalent to processing more than 400 full-HD movies (5 GB each) in a single second.
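As a rough sanity check on those figures (our own back-of-envelope arithmetic, not SK hynix's): 2 TB/s ÷ 5 GB per movie ≈ 400 movies per second, and assuming roughly 1.2 TB/s per stack for 12-layer HBM3E, 2 TB/s works out to an increase of a little over 60%, consistent with the stated improvement.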

In addition, SK hynix has leveraged its Advanced MR-MUF technology, proven in earlier products, to achieve a maximum capacity of 36 GB for the 12-layer HBM4. The process also controls chip warpage and significantly improves heat dissipation, thereby enhancing overall product stability.

Since the introduction of HBM3 in 2022, SK hynix has steadily advanced its HBM product lineup, achieving mass production of 8-layer and 12-layer HBM3E models in 2024. The company's timely development and delivery of HBM products have allowed it to maintain its leadership in the AI memory market.

Kim Ju-seon (Justin Kim), President and Head of the AI Infra division at SK hynix, stated, “To meet customer demand, we have continuously overcome technological challenges and remain at the forefront of AI ecosystem innovation. With the industry's most extensive experience supplying HBM, we are fully prepared for performance validation and mass production.”
