According to a report from South Korea's MK News, Samsung has officially initiated the development of its next-generation HBM4 memory. The company is expected to provide custom HBM4 memory solutions for two major players in the AI cloud services sector: Meta and Microsoft. This marks the first time that Samsung's HBM4 technology will be integrated into mainstream AI solutions.
Unlike previous generations such as HBM3E, which used DRAM-based base dies, Samsung's HBM4 will shift to a logic-based base die, a change aimed at significantly improving both performance and energy efficiency. Positioned beneath the DRAM stack, the logic base die serves as the interface between AI accelerators, such as GPUs, and memory, and controls the flow of data between the GPU and the DRAM. Compared with earlier base dies, the logic die also offers greater flexibility for customization: clients can integrate their own intellectual property (IP), enabling more tailored solutions and more efficient data processing. This shift is expected to reduce power consumption by up to 70%, bringing energy use down to roughly 30% of previous levels.
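To make the reported role of the logic base die more concrete, the following is a minimal, purely conceptual Python sketch of an accelerator routing memory traffic through a base-die controller that exposes a hook for client-supplied IP. All class and function names here are hypothetical illustrations of the architecture described above, not actual Samsung or JEDEC interfaces.

```python
# Purely conceptual sketch: a logic base die modeled as a controller sitting
# between an AI accelerator and stacked DRAM, with a pluggable "client IP" hook.
# All names are hypothetical; nothing here reflects a real Samsung/JEDEC API.
from typing import Callable, Optional


class DRAMStack:
    """Toy model of the stacked DRAM above the base die."""

    def __init__(self, size: int) -> None:
        self.cells = bytearray(size)

    def read(self, addr: int, length: int) -> bytes:
        return bytes(self.cells[addr:addr + length])

    def write(self, addr: int, data: bytes) -> None:
        self.cells[addr:addr + len(data)] = data


class LogicBaseDie:
    """Toy controller for data flow between the accelerator and the DRAM stack.

    The optional `client_ip` hook stands in for customer-integrated IP,
    e.g. address remapping or lightweight preprocessing near memory.
    """

    def __init__(self, dram: DRAMStack,
                 client_ip: Optional[Callable[[bytes], bytes]] = None) -> None:
        self.dram = dram
        self.client_ip = client_ip

    def load(self, addr: int, length: int) -> bytes:
        data = self.dram.read(addr, length)
        # Apply the client's custom logic on the way out, if any was integrated.
        return self.client_ip(data) if self.client_ip else data

    def store(self, addr: int, data: bytes) -> None:
        self.dram.write(addr, data)


# Example: a GPU-like client routes all memory traffic through the base die,
# here with a trivial stand-in "IP block" that upper-cases ASCII payloads.
if __name__ == "__main__":
    die = LogicBaseDie(DRAMStack(1024), client_ip=lambda b: b.upper())
    die.store(0, b"hbm4 custom base die")
    print(die.load(0, 20))  # b'HBM4 CUSTOM BASE DIE'
```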
Samsung's logic base die will be manufactured on an advanced 4nm process developed in-house by its semiconductor division, and the custom dies will be stacked with sixth-generation 10nm-class (1c) DRAM for optimal performance. Reports indicate that the custom HBM4 for Microsoft could be deployed in the company's "Maia 100" AI chip, while Meta's "Artemis" AI chip may also benefit from Samsung's cutting-edge memory technology.