It looks like Samsung may be partnering with Nvidia. Samsung Electronics said on Friday it is in “close discussion” to supply its next-generation high-bandwidth memory (HBM) chips, known as HBM4, to Nvidia, as the South Korean chipmaker scrambles to catch up with rivals in the AI chip race.
High-bandwidth memory (HBM) chips are a type of high-performance RAM designed to provide extremely fast data transfer while using less power and board space than traditional memory such as DDR. Unlike standard DRAM modules, HBM dies are stacked vertically in multiple layers and connected with through-silicon vias (TSVs), which lets data move quickly between the layers and to the processor.
READ: What does Nvidia’s $100 billion investment in OpenAI mean? (September 24, 2025)
HBM is commonly used in graphics cards, AI accelerators, supercomputers, and data centers, where massive bandwidth is crucial for tasks like machine learning, 3D rendering, and scientific simulations. For example, HBM2 and HBM3 can deliver hundreds of gigabytes per second of bandwidth per stack, compared to tens of gigabytes per second for conventional GDDR memory.
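Those figures follow from simple arithmetic: peak bandwidth is the interface width times the per-pin data rate. The sketch below uses representative HBM3 and GDDR6 numbers (a 1024-bit stack interface at 6.4 Gbit/s per pin, and a 32-bit chip interface at 16 Gbit/s per pin), not the specs of any particular product:

```python
# Back-of-the-envelope peak-bandwidth comparison.
# Figures are representative of the HBM3 and GDDR6 generations, not exact products.

def peak_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbit_s: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) x per-pin rate (Gbit/s) / 8."""
    return bus_width_bits * pin_rate_gbit_s / 8

# One HBM3 stack: 1024-bit interface at 6.4 Gbit/s per pin
hbm3_stack = peak_bandwidth_gb_s(1024, 6.4)   # ~819 GB/s per stack

# One GDDR6 chip: 32-bit interface at 16 Gbit/s per pin
gddr6_chip = peak_bandwidth_gb_s(32, 16.0)    # 64 GB/s per chip

print(f"HBM3 stack: {hbm3_stack:.0f} GB/s")
print(f"GDDR6 chip: {gddr6_chip:.0f} GB/s")
```

The order-of-magnitude gap comes almost entirely from the interface width: HBM's stacked design allows a bus roughly 32 times wider than a single GDDR chip's, even at a lower per-pin rate.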
Local rival SK Hynix, Nvidia’s top HBM chip supplier, on Wednesday said it aims to start shipping its latest HBM4 chips in the fourth quarter and expand sales next year.
Nvidia relies heavily on HBM for its high-end GPUs, particularly those used in AI and data-center workloads. HBM offers much higher memory bandwidth per pin than traditional GDDR memory, allowing Nvidia GPUs to feed massive AI models efficiently while generally reducing latency and power consumption. However, Nvidia does not manufacture HBM itself; it sources these chips from suppliers like SK Hynix and Micron. This dependency gives memory suppliers significant influence, though Nvidia is trying to regain leverage by planning to influence the logic-die design of HBM starting around 2027.
Samsung, which plans to market the new chip next year, did not specify when it aims to ship the latest version of its HBM chip, a key building block of artificial intelligence chipsets.
To mitigate supply risks, Nvidia has pressed suppliers to accelerate delivery of next-generation HBM4 chips, reflecting the urgency of high-bandwidth memory for AI growth. As of 2025, HBM4 is in sample or early production stages, with mass production expected later in the year. While HBM boosts performance, it is costly and complex to produce. Some industry commentary suggests Nvidia may explore hybrid memory approaches combining HBM with cheaper memory types like GDDR7, though this has not been publicly confirmed. Supply constraints and technological complexity remain risks, making HBM a central element of Nvidia’s strategy to maintain AI GPU leadership.
Jeff Kim, head of research at KB Securities, reportedly said HBM4 likely needs further testing, but Samsung is widely seen to be in a favorable position given its production capacity.
“If Samsung supplies HBM4 chips to Nvidia, it could secure a significant market share that it was unable to achieve with previous HBM series products,” Kim said.
The developments around HBM4 supply highlight the growing strategic importance of high-bandwidth memory in the AI and data-center GPU market. Because Nvidia depends on HBM to feed large AI models efficiently, securing a stable supply of next-generation memory is critical to its performance and competitiveness. SK Hynix remains the key supplier, but a Samsung partnership could add supply diversity, reduce risk, and intensify competition among memory vendors. The involvement of multiple suppliers may also influence pricing, delivery schedules, and the broader AI chip ecosystem. Overall, the push for HBM4 underscores how high-performance memory has become a cornerstone of AI hardware, shaping market dynamics and determining which companies can sustain leadership in this rapidly evolving sector.

