
According to sources in the memory supply chain, Google's expansion of its TPU platform is set to generate significant new memory demand, putting further upward pressure on memory prices. Currently, SK Hynix has emerged as the largest provider of fifth-generation high-bandwidth memory (HBM3E) for Google and Broadcom. At the same time, Meta's plans to deploy Google TPUs in its data centers could pose a potential challenge to NVIDIA.
Despite rising competition in the AI chip market, NVIDIA's strong CUDA ecosystem continues to provide a near-term edge that is difficult to displace. On the memory side, the market remains dominated by SK Hynix, Samsung, and Micron, with no major shifts in this oligopoly expected.
Recently, the three leading DRAM makers have shared updates on HBM4 development. SK Hynix's HBM4 is expected to reach commercialization in Q4 2025, ahead of market expectations. Micron's CFO Mark Murphy confirmed that HBM3E and HBM4 capacity for 2026 is already fully booked, with HBM4 shipments set to begin in Q2 2026. Samsung's HBM4 is also expected to pass final certification by early 2026.
Analysts expect HBM4 to account for over 50% of total HBM sales in the second half of 2026, becoming the market standard. Prices are projected to rise sharply, roughly 60–70% above HBM3E, reaching around $550 per unit. Next-generation chips, including NVIDIA’s Rubin and AMD's MI400, will adopt HBM4, driving substantial shipment growth in late 2026.
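As a rough sanity check on the projection above, the HBM3E baseline implied by the quoted premium can be backed out; the figures below are illustrative, derived only from the ranges cited in this article, not from sourced pricing data.

```python
# Back out the implied HBM3E unit price from the projected HBM4 figures.
# Numbers come from the ranges quoted above and are illustrative only.
hbm4_price = 550.0                     # projected HBM4 price, USD per unit
uplift_low, uplift_high = 0.60, 0.70   # quoted 60-70% premium over HBM3E

# Implied HBM3E baseline at each end of the premium range
implied_high = hbm4_price / (1 + uplift_low)   # baseline at a 60% premium
implied_low = hbm4_price / (1 + uplift_high)   # baseline at a 70% premium

print(f"Implied HBM3E price: ${implied_low:.2f} - ${implied_high:.2f}")
# -> Implied HBM3E price: $323.53 - $343.75
```

In other words, the projection is consistent with an HBM3E unit price somewhere in the low-to-mid $300s.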
Looking ahead, SK Hynix plans to launch its next-generation HBM4E, based on a 1c process node, between late 2026 and early 2027. While Samsung has been active in the HBM3E space, its first-half 2025 products failed NVIDIA's validation due to performance and stability issues, raising questions about its supply reliability.
To control costs and optimize its supply chain, Google is rolling out its seventh-generation Ironwood TPU, co-designed with Broadcom. SK Hynix, leveraging a customized HBM strategy, is well-positioned to benefit from this wave of large-scale TPU deployments.
From a chip architecture perspective, each Google TPU requires four HBM stacks. Memory demand has surged from 32GB per TPU in the early TPU v4 generation to 192GB in the Ironwood TPU. Industry observers note that SK Hynix holds a strategic advantage in emerging ASIC and TPU markets. In 2026, besides NVIDIA, Google TPU expansion is expected to drive substantial new memory demand, further reinforcing SK Hynix's market position.
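The memory-growth figures above can be expressed as a quick back-of-the-envelope calculation. The per-stack numbers below assume, per the article, four HBM stacks per TPU in both generations; they are derived values, not confirmed specifications.

```python
# Per-stack HBM capacity implied by the per-TPU figures above,
# assuming four HBM stacks per TPU across generations (as stated in the article).
stacks_per_tpu = 4
tpu_v4_memory_gb = 32      # early TPU v4 generation
ironwood_memory_gb = 192   # seventh-generation Ironwood

v4_per_stack = tpu_v4_memory_gb / stacks_per_tpu          # GB per stack, TPU v4
ironwood_per_stack = ironwood_memory_gb / stacks_per_tpu  # GB per stack, Ironwood
growth = ironwood_memory_gb / tpu_v4_memory_gb            # per-TPU capacity growth

print(f"Per-stack capacity: {v4_per_stack:.0f} GB -> "
      f"{ironwood_per_stack:.0f} GB ({growth:.0f}x per TPU)")
# -> Per-stack capacity: 8 GB -> 48 GB (6x per TPU)
```

On these assumptions, per-stack capacity has grown sixfold alongside per-TPU capacity, which is the dynamic driving the new memory demand described above.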