Samsung Electronics plans to begin producing its next-generation high-bandwidth memory, known as HBM4, next month and intends to supply the chips to Nvidia, according to a person familiar with the matter. The source did not share details such as the number of chips expected to be supplied.
The planned production ramp comes as Samsung works to strengthen its position in advanced memory for AI computing, where rival SK Hynix has been a key supplier for Nvidia's AI accelerators. Samsung has been trying to catch up after supply delays last year weighed on its earnings and share price, according to the Reuters report carried by Firstpost.
Production and supply plans
A person familiar with the plan told Reuters that Samsung intends to start HBM4 production next month and supply the chips to Nvidia. A Samsung spokesperson declined to comment, and Nvidia was not immediately available for comment, Reuters said.
While Reuters described the plan as supplying Nvidia, other reporting cited in its story points to broader customer qualification activity. South Korea's Korea Economic Daily reported, citing chip industry sources, that Samsung passed HBM4 qualification tests for both Nvidia and AMD and would start shipping to Nvidia next month, according to the Reuters report carried by Firstpost.
Reports of shipments to Nvidia and AMD
TrendForce, citing local media reports, said industry sources expect Samsung to begin official HBM4 shipments next month to major AI chip customers including Nvidia and AMD. TrendForce said the reports describe customers moving from requesting samples to placing mass-production orders after Samsung completed final qualification tests.
TrendForce also reported that Samsung's HBM4 shipping in February is expected to be delivered to Nvidia and used immediately in performance demonstrations tied to the Rubin AI accelerator, which the report said is set to debut at GTC 2026 in March. Separately, Reuters quoted Nvidia CEO Jensen Huang as saying earlier this month that the company's next-generation chips, the Vera Rubin platform, are in "full production," with Nvidia preparing to launch chips paired with HBM4 later this year.
Claimed performance details
TrendForce reported that one local outlet said Samsung's HBM4 reached an 11.7 Gb per second data rate, exceeding the 10 Gb per second requirement that the report said was set by Nvidia and AMD. TrendForce also said the same report described the product as passing validation without needing a redesign, even after customers requested performance improvements last year.
On manufacturing, TrendForce said Samsung is aiming for higher performance by adopting sixth-generation (1c) DRAM built on a 10nm-class process and a 4nm foundry process for the logic die. TrendForce also reported that sources said Samsung gained lead time by producing the 4nm logic die at its own foundry rather than relying on TSMC.
Competition and upcoming earnings
The Reuters report carried by Firstpost described SK Hynix as a primary supplier of advanced memory chips used for Nvidia’s AI accelerators. In market reaction noted in that report, Samsung shares rose 2.2% while SK Hynix shares fell 2.9% in morning trade.
Reuters also reported that SK Hynix said in October it had completed HBM supply talks with major customers for next year. In addition, an SK Hynix executive told Reuters earlier this month that the company plans to begin feeding silicon wafers next month into a new fab, M15X, in Cheongju, South Korea, to produce HBM chips, without saying whether HBM4 would be part of initial production.
Both Samsung and SK Hynix are scheduled to announce fourth-quarter earnings on Thursday, when they are expected to share details of HBM4 orders, according to Reuters. TrendForce, meanwhile, said one report expects full-scale HBM4 supply to start around June, while noting that shipment volumes will depend on customers' mass-production schedules for their final products.
