
According to Korean media outlet Money Today, leading memory manufacturers including Samsung Electronics, SK Hynix, and Micron Technology are accelerating their development and market positioning for seventh-generation high-bandwidth memory (HBM4E), a key next-generation memory standard expected to enter rapid growth from 2027 onward.
Industry momentum is driven largely by major AI and semiconductor ecosystem players such as NVIDIA, Advanced Micro Devices, and Google, all of which plan to integrate HBM4E into their next-generation AI chips. This shift is expected to significantly expand demand for high-performance memory solutions.
On the product roadmap side, NVIDIA is reportedly preparing to deploy HBM4E in its next-generation AI accelerator, the Vera Rubin Ultra platform, targeting up to 1TB of memory capacity, substantially more than the 288GB configuration of the standard Vera Rubin generation. AMD is also expected to introduce HBM4E in its Instinct MI500 accelerator series, based on the CDNA 6 architecture and scheduled for 2027; AMD CEO Lisa Su indicated at CES 2026 that the product will be manufactured on TSMC's 2nm-class process technology. Google is likewise evaluating HBM4E for its next-generation Tensor Processing Units (TPUs).
To secure an early market advantage, Samsung plans to deliver its first HBM4E samples to key customers in May 2026. The parts are expected to be built on a 1c DRAM process node and paired with a base die produced on Samsung's 4nm-class foundry technology. The company also plans to introduce next-generation hybrid bonding in selected HBM4E stacks to increase interconnect density and performance, aiming to reinforce its leadership in advanced memory solutions.
SK Hynix is targeting HBM4E sample shipments in the second half of 2026, migrating its DRAM process from 1b to 1c. Its base die may also be manufactured on TSMC's 3nm-class process to improve performance and efficiency while maintaining stable supply. Micron is following a similar path, building HBM4E on a 1c DRAM process with a 3nm-class base die fabricated at TSMC, and targeting mass production in the second half of 2027.
Industry analysts in South Korea note that HBM4E represents a major leap not only in memory capacity but also in process integration, advanced DRAM scaling, and packaging technologies. As leading AI chipmakers finalize their roadmaps, competition among memory manufacturers is expected to intensify significantly, with performance, yield, and advanced packaging capability becoming decisive factors in the next phase of the high-bandwidth memory market.