HBM chips are widely used in advanced Artificial Intelligence (AI) processors. Industry reports indicate that NVIDIA's rigorous quality testing poses a challenge for memory manufacturers, since HBM yields are lower than those of traditional DRAM products.
Leading semiconductor manufacturers, including TSMC and Samsung, have historically grappled with maintaining optimal yields on individual silicon wafers, and they now confront a similar challenge in the HBM sector. Notably, key players such as Micron and SK Hynix are set to compete in qualification tests for NVIDIA's next-generation AI GPU, with yield rates emerging as a significant hurdle.
In the intricate manufacturing process of HBM, the complexity of multi-layer stacking reduces yields. The use of Through Silicon Via (TSV) technology to interconnect the stacked dies introduces additional opportunities for defects during manufacturing. A defect found in any single layer of the HBM stack forces the entire stack to be discarded, which makes improving yield rates a formidable challenge, as the sketch below illustrates.
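To see why stacking is so punishing, consider a minimal model in which every die and every bonding step must succeed for the stack to be sellable, so the per-step yields multiply rather than average. The following Python sketch uses illustrative, assumed figures (per-die yield, bonding yield, layer counts); they are not published process data, and real manufacturing flows also use known-good-die testing and repair that this simple model ignores.

```python
# Minimal sketch of how per-step yield compounds across an HBM stack.
# All numbers below are illustrative assumptions, not published process data.

def stacked_yield(per_die_yield: float, bond_yield: float, layers: int) -> float:
    """Probability that every die and every TSV bonding step in the stack is good."""
    # One bad die or one bad bond scraps the whole stack,
    # so the per-layer probabilities multiply.
    return (per_die_yield * bond_yield) ** layers

if __name__ == "__main__":
    for layers in (4, 8, 12):
        y = stacked_yield(per_die_yield=0.99, bond_yield=0.97, layers=layers)
        print(f"{layers}-high stack: {y:.1%}")
```

Even with 99% die yield and 97% bonding yield per layer, the compounded yield of a 12-high stack falls to roughly 60%, which is why taller stacks are disproportionately harder to produce.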
Market insights suggest that the overall yield rate for HBM memory chips currently stands at approximately 65%, with Micron and SK Hynix appearing to hold a leading position in this competitive landscape. Micron has already commenced production of HBM3e memory chips for NVIDIA's latest H200 AI GPU, having successfully passed the certification stage set by the industry giant.
Kim Ki-tae, Vice President of SK Hynix, said in an official statement on February 21 that despite prevailing external uncertainties, the memory chip market is poised for gradual recovery this year. Rising demand from applications such as PCs and smartphones is expected to propel sales of HBM3e and stimulate demand for DDR5, LPDDR5T, and related products. The executive highlighted that all HBM inventory has been sold out, and the company is already preparing for 2025.