JEDEC outlines LPDDR6 expansion for data centers

JEDEC has previewed planned updates to LPDDR6 aimed at pushing the memory standard beyond mobile devices and into selected data center and accelerated computing use cases. The roadmap includes higher-capacity packaging options, flexible metadata support, 512 GB densities, and a new SOCAMM2 module standard.

JEDEC has previewed a new set of features for the next version of its JESD209-6 LPDDR6 standard. Building on the foundational JESD209-6 published in July 2025, the JC-42.6 Subcommittee is working to extend LPDDR6 beyond mobile platforms to support selected data center and accelerated computing workloads that need a power-efficient, high-capacity memory platform.

A central change is a narrower per-die interface intended to raise memory capacity. With the move to a non-binary interface width (from x16 to x24), the inclusion of x12 and an additional x6 sub-channel mode allows more die per package and higher memory capacities per component and per channel. JEDEC positions this as an important step for Artificial Intelligence-scale memory footprints and broader high-capacity deployments.
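The capacity argument can be sketched with simple arithmetic: the narrower each die's interface, the more dies can share one fixed-width channel. The die density and widths below are illustrative assumptions, not figures from the JEDEC specification.

```python
# Illustrative only: narrower per-die interfaces let more dies share
# a fixed-width channel, multiplying capacity per channel.
# DIE_DENSITY_GBIT is a hypothetical per-die density.

CHANNEL_WIDTH = 24      # bits; LPDDR6 non-binary channel width
DIE_DENSITY_GBIT = 32   # assumed density of one die (hypothetical)

for die_width in (24, 12, 6):
    dies_per_channel = CHANNEL_WIDTH // die_width
    capacity_gbit = dies_per_channel * DIE_DENSITY_GBIT
    print(f"x{die_width} dies: {dies_per_channel} per channel "
          f"-> {capacity_gbit} Gbit")
```

Under these assumptions, halving the per-die width from x12 to x6 doubles the dies on a channel, and with them the channel's capacity.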

The planned update also introduces a flexible metadata carve-out intended to minimize the impact on peak data throughput. That approach is meant to give data center customers more control over how they balance user capacity against metadata needs according to their own reliability requirements. The design focuses on adapting LPDDR6 for workloads that need both efficiency and operational flexibility.
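The capacity-versus-metadata balance described above can be made concrete with a small model. The 32-byte data unit and the carve-out granularity here are hypothetical assumptions for illustration; the standard's actual metadata format is not detailed in this announcement.

```python
# Illustrative sketch: how reserving metadata bits per data unit
# reduces usable capacity. The 32-byte unit size and bit counts are
# hypothetical, not from the JEDEC specification.

DATA_UNIT_BYTES = 32  # assumed data unit the metadata is attached to

def usable_capacity(total_bytes: int, meta_bits_per_unit: int) -> int:
    """Bytes left for user data after carving out metadata."""
    data_bits = DATA_UNIT_BYTES * 8
    meta_fraction = meta_bits_per_unit / (data_bits + meta_bits_per_unit)
    return int(total_bytes * (1 - meta_fraction))

# A deployment with stricter reliability needs reserves more bits per
# unit and accepts less user-visible capacity, and vice versa.
for bits in (0, 8, 16):
    print(bits, "meta bits/unit ->", usable_capacity(512 * 2**30, bits), "B")
```

The point of the flexible carve-out is that this trade-off becomes a per-deployment choice rather than a fixed property of the part.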

Capacity expansion remains a major theme in the roadmap, with 512 GB densities on the horizon: LPDDR6 is expected to unlock densities beyond the current LPDDR5/5X maximum, a capability designed to address the ever-growing memory capacity requirements of Artificial Intelligence training and inference workloads. JEDEC is also actively developing an LPDDR6-based SOCAMM2 module standard, designed to carry the compact, serviceable module form factor forward and provide a clear upgrade path from today’s LPDDR5X SOCAMM2 modules.

Impact Score: 58

TSMC debuts A13 process technology

TSMC has introduced its A13 process at its 2026 North America Technology Symposium as a tighter version of A14 aimed at next-generation Artificial Intelligence, high performance computing, and mobile designs. The company positions the node as a more compact and efficient option with backward-compatible design rules for faster migration.

Google unveils eighth-generation tensor processor units

Google introduced its eighth generation of custom tensor processor units with separate designs for training and inference. The new TPU 8t and TPU 8i are aimed at large-scale model training, serving, and agentic workloads.
