
According to industry sources, Samsung Electronics is preparing to produce the first samples of its next-generation high-bandwidth memory, HBM4E, as early as May, with the goal of completing internal validation before delivering chips to NVIDIA.
The company is accelerating development of its seventh-generation HBM products to strengthen its position in the rapidly expanding AI memory market. Samsung aims to achieve target performance levels in initial samples before providing them to customers, ensuring readiness for high-performance computing workloads used in advanced AI systems.
According to reports, Samsung’s foundry division is expected to complete HBM4E logic chip samples in mid-May. These logic components will then be transferred to the memory division for integration with DRAM and advanced packaging. The final assembled samples will undergo internal performance evaluation prior to shipment to NVIDIA for qualification testing.
Samsung previously showcased a physical version of the HBM4E chip at the GTC 2026 event in March. However, industry observers noted that the demonstration sample was primarily for presentation purposes and had not yet reached full commercial performance standards. The HBM4E design is expected to deliver data transfer speeds of up to 16 Gbps per pin and a bandwidth of approximately 4.0 TB/s, representing a significant improvement over current HBM4 technology.
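The quoted figures are consistent with each other: per-stack bandwidth follows directly from the per-pin data rate and the interface width. A minimal sketch, assuming HBM4E retains the 2048-bit interface defined for JEDEC HBM4 (the interface width is an assumption, not stated in the article):

```python
def hbm_bandwidth_tbps(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
    """Per-stack bandwidth in TB/s (1 TB = 1000 GB).

    bus_width_bits defaults to 2048, the JEDEC HBM4 interface width;
    HBM4E is assumed here to keep the same width.
    """
    return pin_speed_gbps * bus_width_bits / 8 / 1000

# 16 Gbps per pin across 2048 pins works out to roughly 4.1 TB/s,
# matching the "approximately 4.0 TB/s" figure cited for HBM4E.
print(hbm_bandwidth_tbps(16))
```

The same arithmetic applied to HBM4's ~10 Gbps per-pin rates yields the ~2.5 TB/s class bandwidths of the current generation, which is the baseline the article's "significant improvement" refers to.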
To secure an early lead in HBM4 mass production, Samsung is adopting more advanced process technologies than its competitors. Industry sources indicate the logic die may be manufactured on a 4nm process, while the DRAM component is expected to use a 10nm-class (1c-level) process node.
Rival SK Hynix is also accelerating its HBM4E development roadmap and reportedly plans to use similarly advanced DRAM and logic process nodes in its next-generation products, intensifying competition in the high-bandwidth memory segment.
The production timeline for NVIDIA's Vera Rubin AI platforms, which are expected to adopt HBM4 and HBM4E memory, has undergone adjustments. However, Samsung is pushing forward aggressively to avoid repeating the delays seen in its HBM3E rollout and to strengthen its position in next-generation AI memory supply chains.