Samsung Signs $3B HBM3E Supply Deal with AMD!

2024-04-25

According to South Korean media reports, Samsung has signed a $3 billion supply agreement with AMD. Under the deal, Samsung will supply AMD with HBM3E 12H DRAM, which is expected to be integrated into AMD's Instinct MI350 series AI chips. In return, Samsung will purchase GPUs from AMD, though the specific products and quantities have not been disclosed.

In October 2023, Samsung hosted its "Samsung Memory Tech Day 2023," unveiling its next-generation HBM3E, codenamed Shinebolt. By February 2024, Samsung announced the development of the industry's first HBM3E 12H DRAM, featuring a 12-layer stack with a bandwidth of up to 1,280 GB/s and a capacity of 36 GB, making it the highest-performing and highest-capacity HBM product to date.
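
The headline figures line up with HBM's very wide interface. As a rough, illustrative check (assuming the standard 1024-bit HBM interface, a per-pin rate of about 10 Gb/s implied by the quoted bandwidth, and 24 Gb DRAM dies; the article does not confirm these exact internals), the numbers can be reproduced as follows:

# Rough sanity check of the quoted HBM3E 12H figures.
# Assumptions (illustrative, not confirmed by the article):
#   - 1024-bit interface, as in previous HBM generations
#   - ~10 Gb/s per pin, implied by the quoted 1,280 GB/s
#   - 24 Gb (3 GB) DRAM dies, stacked 12 high ("12H")
INTERFACE_WIDTH_BITS = 1024
PIN_RATE_GBPS = 10.0
DIE_CAPACITY_GBIT = 24
LAYERS = 12

bandwidth_gb_per_s = INTERFACE_WIDTH_BITS * PIN_RATE_GBPS / 8   # -> 1280 GB/s
capacity_gb = DIE_CAPACITY_GBIT * LAYERS / 8                    # -> 36 GB

print(f"Per-stack bandwidth: {bandwidth_gb_per_s:.0f} GB/s")
print(f"Per-stack capacity:  {capacity_gb:.0f} GB")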

The HBM3E 12H uses advanced thermal compression non-conductive film (TC NCF) technology, which allows the 12-layer product to stay within the same height as the 8-layer model and thus meet current HBM packaging requirements. The technique is especially beneficial in higher stacks, where thinner dies make chip warping harder to control. By progressively reducing the thickness of the NCF material, Samsung has achieved the industry's smallest gap between chips (7 μm) while eliminating voids between layers, increasing vertical density by more than 20% compared with HBM3 8H. Samsung's advanced TC NCF also improves thermal performance by placing bumps of different sizes between the chips: smaller bumps in signal areas and larger ones in areas that dissipate heat. This approach additionally improves product yield.

Samsung notes that as AI applications grow exponentially, HBM3E 12H is positioned to become the optimal solution for future systems that require more memory. Its higher performance and capacity give customers greater flexibility in managing resources and reduce the total cost of ownership (TCO) of data centers. Used in AI applications, HBM3E 12H is projected to increase average AI training speed by 34% and to support more than 11.5 times as many inference service users compared with HBM3 8H. Samsung has already begun supplying HBM3E 12H samples to customers and expects to begin mass production in the first half of 2024.

Market reports suggest AMD plans to release its Instinct MI350 series AI chips in the latter half of 2024, as an upgrade to its Instinct MI300 AI chips. These chips will utilize TSMC's 4nm process technology to enhance computational performance while lowering power consumption. By incorporating the 12-layer stacked HBM3E, the series will achieve higher transmission bandwidths and increased memory capacity.
