Micron launches 192 GB SOCAMM2 for low-power Artificial Intelligence data centers

Micron is sampling a 192 GB SOCAMM2 module to expand adoption of low-power memory in Artificial Intelligence data centers. The module increases capacity in the same footprint and promises large reductions in time to first token for real-time inference.

Micron Technology announced customer sampling of a 192 GB SOCAMM2, a small outline compression attached memory module designed to broaden adoption of low-power DRAM in Artificial Intelligence data centers. The SOCAMM2 builds on Micron's earlier low-power DRAM SOCAMM and delivers 50 percent more capacity in the same compact footprint. Micron positions the added capacity as a direct enabler of performance gains in real-time inference workloads.

The company says the additional capacity can cut time to first token, or TTFT, by more than 80 percent for real-time inference. The 192 GB SOCAMM2 is built on Micron's 1-gamma DRAM process technology and delivers a greater than 20 percent improvement in power efficiency, enabling tighter power design optimization across large data center clusters. Those efficiency gains grow more significant at scale, where full-rack installations can include more than 40 terabytes of CPU-attached low-power DRAM main memory.

Micron highlights the SOCAMM2's modular design as improving serviceability and providing a pathway for future capacity expansion in data center deployments. The announcement focuses on customer sampling as the next step toward broader market availability and frames the product as a response to the industry shift toward more energy-efficient infrastructure to support growth in Artificial Intelligence workloads.


Microsoft launches Copilot Health in the US

Microsoft has introduced Copilot Health as a protected space inside Copilot that combines medical records, wearable data and lab results into personalised health insights. The service is launching first for adults in the US with strong privacy controls and a limited initial rollout.

Tesla plans terafab for Artificial Intelligence chips

Tesla is moving toward a large-scale chip manufacturing project to support its autonomous driving roadmap. Elon Musk said the terafab effort for Artificial Intelligence chips will launch in seven days and may involve Intel, TSMC and Samsung.

Timeline traces evolution, civilisation and planetary stewardship

A sweeping chronology links cosmology, evolution, human history and modern environmental risk in a single long view of the human condition. The sequence culminates in contemporary debates over climate change, biodiversity loss and artificial intelligence governance.

Wolters Kluwer report tracks Artificial Intelligence shift in legal work

Wolters Kluwer’s 2026 Future Ready Lawyer findings show Artificial Intelligence has become a foundational tool across law firms and corporate legal departments. The survey points to measurable time savings, revenue growth, and rising pressure to strengthen training, ethics, and security.

Anthropic March 2026 release roundup

Anthropic rolled out a broad set of March 2026 updates across Claude Code, the Claude Developer Platform, Claude apps, and enterprise partnerships. Changes focused on larger context windows, workflow improvements, reliability fixes, visual output features, and new partner enablement programs.
