Samsung reportedly slashes HBM3E prices to woo Nvidia: cuts could put the heat on rivals SK hynix and Micron as the company attempts to spur an AI turnaround

Samsung’s semiconductor business is coming off a rough quarter, with profits plunging nearly 94% year-over-year. It’s the weakest the chip division has looked in six quarters, and the culprit isn’t hard to identify. Between U.S. export restrictions limiting the sale of advanced chips to China and persistent inventory corrections, the company’s Device Solutions division recorded just 400 billion won ($287 million) in profit for Q2 2025, a steep fall from the 6.5 trillion won ($4.67 billion) it pulled in during the same period last year. However, Samsung is betting big that artificial intelligence will flip that narrative by the end of the year.

The recovery, as Samsung sees it, lies in HBM3E—the latest generation of High Bandwidth Memory used to feed data to AI accelerators at blazing speeds. According to a ZDNet Korea report, the company is actively working to lower the production costs of HBM3E in an effort to court Nvidia, which has so far leaned heavily on SK hynix for its AI GPU supply chain.

Samsung’s strategy is simple: make its HBM3E memory more affordable and more readily available than anyone else’s, and become indispensable to the future of AI computing. For context, HBM3E (an extended, higher-speed version of HBM3) is critical in modern AI accelerators, especially for large language model training, where bandwidth and capacity bottlenecks limit performance. It’s the memory found on Nvidia’s top-end AI GPUs, such as the B300.

Samsung’s memory division posted quarterly sales of 21.2 trillion won ($15.2 billion), up 11% from Q1, thanks in part to the expansion of HBM3E and high-density DDR5 for servers. NAND inventory is being cleared out faster too, with server SSD sales picking up. For the second half, Samsung plans to ramp up production of 128GB DDR5 modules, 24Gb GDDR7, and 8th-gen V-NAND, particularly for AI server deployments.
