Samsung Electronics has strengthened its position in both the 7th-generation HBM4E and custom AI chip (ASIC) markets after becoming the first company to ship 6th-generation High Bandwidth Memory (HBM4) to Nvidia, the world's largest AI accelerator company.

Micron, which had been competing for the HBM4 market, has reportedly been excluded from the initial supplier list after failing to meet Nvidia's performance requirements. This sets up a direct confrontation between market leader SK Hynix—which has dominated the 4th and 5th generation (HBM3 and HBM3E) markets—and Samsung Electronics.
Samsung's reclamation of first-supplier status for HBM4 after three years resulted from intense self-reflection. After ceding HBM priority supplier status to SK Hynix in 2023, Samsung faced a crisis as its memory business performance faltered due to heat dissipation issues and delayed additional shipments. Vice Chairman Young-hyun Jeon, who returned as head of the DS Division in May 2024, issued a public apology following the Q3 preliminary results announcement in October: "I am sorry for causing concern about our fundamental technological competitiveness and the company's future with results that fell short of market expectations." He then declared a commitment to restoring "fundamental technological competitiveness," signaling a comprehensive overhaul of organizational culture and work practices.
The technological breakthrough came through "bold process selection." During HBM4 development, Samsung fabricated the DRAM on its 6th-generation 1c-class (11-nanometer) process, more advanced than its competitors', and designed the logic die that connects AI accelerators to HBM at 4 nanometers. This was a calculated gamble to achieve overwhelming performance by adopting cutting-edge processes with higher technical difficulty than competitors using 5th-generation (1b) DRAM and 12-nanometer logic dies.
The strategy paid off. HBM4's operating speed reached 11.7 gigabits per second per pin, an industry-leading figure that surpassed competitors' products. Samsung was ultimately selected as the first supplier of HBM4 for Nvidia's next-generation Vera Rubin AI accelerator, overtaking SK Hynix in HBM4 development after three years.
Additionally, Samsung delivered its delayed HBM3E to Nvidia in the second half of last year, reclaiming the top spot in DRAM market share in Q4 for the first time in a year, according to Counterpoint Research. Industry observers expect Samsung to further consolidate its market leadership starting with HBM4.
Samsung is displaying confidence by revealing production plans for next-generation products. During its Q4 2025 earnings call on January 29, Samsung stated: "We have secured production-ready 16-layer stacking packaging technology for both HBM3E and HBM4." This indicates the company has completed preparations for 16-layer products—an improvement over the 12-layer HBM4 it will supply to Nvidia after the Lunar New Year holiday.
Samsung also aims to capture the 7th-generation HBM4E market. In October last year in San Jose, Samsung became the first in the industry to announce a target operating speed of 13 Gbps for HBM4E—30% faster than HBM4's 10 Gbps-plus speed. Samsung's strategy is to maintain technological leadership through advanced process technology and speed.
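The speed figures quoted above translate directly into per-stack bandwidth. A minimal back-of-envelope sketch, assuming HBM4's 2048-bit-per-stack interface (a figure from the JEDEC HBM4 standard, not stated in the article itself):

```python
# Rough per-stack bandwidth math for the per-pin speeds quoted in the article.
# ASSUMPTION: a 2048-bit interface per HBM4 stack, per the JEDEC HBM4 standard;
# the article quotes only per-pin speeds, not bus width.

def stack_bandwidth_gbps(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
    """Peak per-stack bandwidth in GB/s: pin speed (Gb/s) x bus width / 8 bits per byte."""
    return pin_speed_gbps * bus_width_bits / 8

hbm4 = stack_bandwidth_gbps(11.7)   # Samsung's reported HBM4 operating speed
hbm4e = stack_bandwidth_gbps(13.0)  # Samsung's announced HBM4E target
print(f"HBM4  @ 11.7 Gb/s/pin: {hbm4:,.0f} GB/s per stack")
print(f"HBM4E @ 13.0 Gb/s/pin: {hbm4e:,.0f} GB/s per stack")
print(f"HBM4E target vs 10 Gb/s baseline: {13.0 / 10.0 - 1:.0%} faster")
```

This reproduces the article's 30% figure (13 Gbps against the 10 Gbps-plus HBM4 baseline) and shows why per-pin speed matters: at a fixed bus width, every gigabit per second per pin adds 256 GB/s of peak bandwidth per stack.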
As Samsung expands its HBM4 influence, its position in the diversifying AI semiconductor market—spanning GPUs, CPUs, TPUs, and other "XPUs"—is expected to strengthen. Unlike previous generations, whose base dies served mainly as simple interfaces for data transfer, HBM4 requires a custom-designed base die (logic die) at the bottom of the stack.
From HBM4 onward, integrated capabilities connecting design, manufacturing, and foundry services become essential—and Samsung is the only company in the world capable of providing such a "one-stop solution." An industry source explained: "Unlike competitors who ship their manufactured HBM to Taiwan for assembly, Samsung can handle the entire process in-house. The performance and supply volume of Samsung's HBM4 will determine the performance of next-generation AI accelerators."
