Samsung raises DDR5 prices by up to 60% as Artificial Intelligence data center demand tightens supply

Samsung has raised prices for several DDR5 memory modules by up to 60% amid tightening supply as companies race to build Artificial Intelligence data centers. The move follows a delayed contract pricing update and is adding pressure on hyperscalers and original equipment manufacturers.

Samsung has raised prices for several server-grade DDR5 memory modules by as much as 60% this month, according to people familiar with the changes. The increases came after Samsung delayed its usual October contract pricing update and reflect tightening supply tied to the global push to build Artificial Intelligence data centers. Sources reported sharp month-to-month jumps across multiple module densities, with 16 gigabyte and 128 gigabyte modules up by about 50% and 64 gigabyte and 96 gigabyte modules up by more than 30%.

The pricing surge is straining hyperscalers and original equipment manufacturers that are already short on parts. Fusion Worldwide president Tobey Gonnerman told Reuters that many major server builders accept they will not get enough product and that premiums have reached extreme levels. The shortage has prompted panic buying among some customers and is spilling into other parts of the semiconductor market. Semiconductor Manufacturing International Corporation said tight DDR5 supply has caused buyers to hold back orders for other chip types, and Xiaomi warned that rising memory costs are pushing up smartphone production expenses.

The squeeze is strengthening Samsung’s memory division even as the company trails rivals in advanced Artificial Intelligence processors. Market tracker TrendForce expects Samsung’s contract prices to rise 40% to 50% in the fourth quarter, above the roughly 30% increase forecast for the broader market, supported by strong demand and long-term supply deals extending into 2026 and 2027. Sources indicated the imbalance in server DRAM supply is giving Samsung more pricing leverage relative to SK Hynix and Micron, reshaping competitiveness in the memory market as data center buildouts accelerate.

Impact Score: 58

RDMA for S3-compatible storage accelerates Artificial Intelligence workloads

RDMA for S3-compatible storage uses remote direct memory access to speed S3-API object storage access for Artificial Intelligence workloads, reducing latency, lowering CPU use and improving throughput. Nvidia and multiple storage vendors are integrating client and server libraries to enable faster, portable data access across on-premises and cloud environments.

Technologies that could help end animal testing

The UK has set timelines to phase out many forms of animal testing while regulators and researchers explore alternatives. The strategy highlights organs-on-chips, organoids, digital twins and Artificial Intelligence as tools that could reduce or replace animal use.

Nvidia to sell fully integrated Artificial Intelligence servers

A report picked up by Tom’s Hardware and discussed on Hacker News says Nvidia is preparing to sell fully built rack and tray assemblies that include Vera CPUs, Rubin GPUs and integrated cooling, moving beyond supplying only GPUs and components for Artificial Intelligence workloads.
