On February 27th, Samsung announced the industry's first 12-layer stacked HBM3E 12H high-bandwidth memory (HBM) chip, offering the highest capacity to date at 36GB and a bandwidth of 1,280GB/s. Compared with 8-layer stacked HBM3 8H products, the new chip delivers increases of more than 50% in both capacity and bandwidth, significantly accelerating artificial intelligence (AI) training and inference.
Yongcheol Bae, Vice President of Memory Product Planning at Samsung Electronics, noted that AI service providers increasingly demand higher-capacity HBM; the HBM3E 12H was designed specifically to meet that requirement.
Technologically, Samsung's HBM3E 12H employs advanced thermal compression non-conductive film (TC NCF), which keeps the 12-layer stack at the same height as 8-layer HBM chips to satisfy current HBM packaging requirements. The technology is expected to offer further advantages as stacking grows taller, mitigating the warping challenges that come with thinner bare dies. Samsung has continuously reduced the thickness of the NCF material, achieving the industry's smallest gap between chips (7 micrometers) and eliminating voids between layers. These advances raise vertical density by more than 20% over the previous HBM3 8H product.
Samsung states that TC NCF also improves the HBM's thermal performance by allowing bumps of different sizes between the chips: smaller bumps are used in signal-transmission areas, while larger bumps are placed where heat dissipation is needed. This approach additionally helps improve product yield.
NVIDIA's flagship AI chip, the H200, has already adopted HBM3E memory, and the next-generation B100 is expected to follow suit. The major memory chipmakers, including Samsung, SK Hynix, and Micron, are all strategically focusing on HBM technology.
Samsung asserts that HBM3E 12H will be the premier solution going forward, lowering total cost of ownership (TCO) for data centers. In terms of performance, the new product delivers a 34% average increase in AI training speed over HBM3 8H and supports more than 11.5 times as many users for inference services.
Samsung has begun shipping HBM3E 12H samples to customers and plans to start mass production in the first half of this year.