
As AI demand surges, memory chip prices are skyrocketing, reshaping profitability across the semiconductor industry. According to the Korea Economic Daily, the memory divisions of Samsung Electronics and SK Hynix are expected to post gross margins in Q4 2025 that surpass TSMC's, a milestone the memory sector last reached in Q4 2018.
Analysts predict Samsung and SK Hynix will post gross margins between 63% and 67%, above TSMC's forecast 60%. Micron, the world's third-largest memory chip maker, reported a 56% gross margin for its fiscal first quarter of 2026 (September–November 2025) and guided to 67% for the second quarter, suggesting it too could outperform TSMC in early 2026.
Rapid increases in memory chip prices are driving this profitability. The top three memory manufacturers have allocated roughly 18%–28% of their DRAM wafer capacity to high-bandwidth memory (HBM). Because each HBM package stacks 8 to 16 DRAM dies, this allocation sharply reduces the supply of general-purpose DRAM, producing single-quarter price jumps of more than 30%.
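The supply squeeze described above can be sketched with a back-of-envelope calculation. The 18%–28% HBM allocation range comes from the article; the unit-elastic demand assumption (price scaling with the inverse of supply) is purely illustrative, not a claim about actual DRAM market elasticity.

```python
# Back-of-envelope sketch: HBM capacity allocation vs. commodity DRAM supply.
# The 18%-28% allocation range is from the article; the unit-elastic demand
# model (price ~ 1/supply) is an illustrative assumption only.

def commodity_share(hbm_share: float) -> float:
    """Fraction of DRAM wafer output left for general-purpose DRAM."""
    return 1.0 - hbm_share

def price_multiplier(hbm_share: float) -> float:
    """Implied commodity DRAM price under an assumed unit-elastic demand curve."""
    return 1.0 / commodity_share(hbm_share)

for share in (0.18, 0.28):  # the article's cited allocation range
    print(f"HBM share {share:.0%}: "
          f"commodity supply {commodity_share(share):.0%}, "
          f"implied price change +{price_multiplier(share) - 1:.0%}")
```

Even under this crude model, the upper end of the allocation range implies a price rise of roughly 39%, consistent in magnitude with the 30%-plus quarterly jumps the article reports.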
The shift in AI from training to inference is also fueling demand. Inference applies knowledge gained during training to real-world problems, which requires data to be stored and retrieved quickly; HBM and similar memory continuously feed data to GPUs. This demand has pushed memory chip margins above those of wafer foundries.
While general-purpose DRAM trails HBM in performance, lighter AI inference workloads often run on DRAM such as GDDR7 and LPDDR5X, reserving HBM for more demanding tasks. NVIDIA, for instance, uses GDDR7 in inference-focused AI accelerators.
Looking ahead, memory manufacturers aim to maintain their dominance in the AI era with high-performance, AI-optimized products. Processing-in-memory (PIM) technology, which lets memory handle computation traditionally performed by GPUs, is one example. Innovations such as vertical channel transistor (VCT) DRAM and 3D DRAM promise higher bit density by storing more data in the same silicon area, further boosting memory efficiency.