Samsung SOCAMM2 LPDDR module targets next-generation artificial intelligence data centers

Samsung has introduced SOCAMM2, an LPDDR-based server memory module with a modular, detachable design, aimed at improving bandwidth, power efficiency, and integration in artificial intelligence data centers. The company is already supplying customer samples as demand rises for low-power memory tailored to continuous artificial intelligence workloads.

As artificial intelligence adoption accelerates worldwide, data centers face rapid growth in computational workloads that increasingly prioritize both performance and energy efficiency. The shift from large-scale model training to continuous inference means systems must sustain heavy artificial intelligence workloads over long periods while keeping power consumption in check. This changing usage pattern is driving strong interest in low-power memory technologies that can support nonstop artificial intelligence services without overwhelming data center energy budgets.

Responding to this trend, Samsung has introduced SOCAMM2, short for Small Outline Compression Attached Memory Module, as a new LPDDR-based server memory option for artificial intelligence data centers. The company states that SOCAMM2 is already being sampled to customers, signaling that the design has progressed beyond early concept stages and into practical evaluation by server makers. By building on LPDDR technology, traditionally known for lower power consumption than conventional server memory, Samsung is positioning SOCAMM2 as a component that can help operators balance performance and efficiency in artificial intelligence servers.

SOCAMM2 combines the inherent low-power benefits of LPDDR with a modular, detachable form factor designed for flexible system integration. According to Samsung, this approach allows the module to deliver higher bandwidth and improved power efficiency while making it easier for system designers to configure and scale memory within artificial intelligence server platforms. The company emphasizes that this combination of bandwidth, energy savings, and modular design is intended to let artificial intelligence servers achieve greater overall efficiency and scalability as data centers adapt to the demands of continuous inference workloads.

Impact Score: 52

SK Group warns DRAM shortages could curb memory use

SK Group chairman Chey Tae-won warned that customers may reduce memory consumption through infrastructure and software optimization if DRAM suppliers fail to raise output. Demand from artificial intelligence data centers is keeping the market tight as memory makers weigh expansion against the long timelines for new fabs.

BitUnlocker bypasses TPM-only Windows 11 BitLocker

Intrinsec disclosed BitUnlocker, a downgrade attack that can bypass TPM-only Windows 11 BitLocker protections with physical access to a machine. The technique abuses a flaw in Windows recovery and deployment components and relies on older trusted boot code.

Micron samples 256 GB DDR5 9200 MT/s RDIMM server modules

Micron has begun sampling 256 GB DDR5 RDIMM server modules built on its 1-gamma technology to key ecosystem partners. The company positions the new modules as a higher-speed, more power-efficient option for scaling next-generation artificial intelligence and HPC infrastructure.
