Nvidia taps Micron for large-scale SOCAMM memory rollout in 2025

Nvidia will deploy up to 800,000 SOCAMM modules in 2025, with Micron beating Samsung and SK Hynix to the production lead for Nvidia's next-generation Artificial Intelligence platforms.

Nvidia has announced plans to deploy between 600,000 and 800,000 SOCAMM memory modules in 2025 as it broadens its memory strategy beyond traditional high-bandwidth memory solutions. The company is integrating SOCAMM technology into its forthcoming Artificial Intelligence-focused products such as the GB300 'Blackwell' platform and the new AI PC Digits system, which it recently showcased at GTC 2025. According to industry sources, Nvidia has provided detailed projections to memory and substrate suppliers, signaling a coordinated push for volume deployment. The company collaborated with Samsung Electronics, SK Hynix, and Micron Technology on SOCAMM co-development, but reports from Digitimes Asia, ET News, and Wccftech confirm Micron as the first to secure approval for full-scale production.

SOCAMM, short for Small Outline Compression Attached Memory Module, builds on LPDDR DRAM to achieve higher bandwidth with greater efficiency. Micron claims that SOCAMM delivers 2.5 times the bandwidth of conventional server RDIMM modules while cutting module size and power usage by one third. This blend of higher performance and lower energy consumption directly addresses the growing demands of next-generation Artificial Intelligence workloads. Although the planned SOCAMM deployment is small compared with Nvidia's anticipated procurement of 9 million HBM units for the same year, it represents a meaningful shift in the memory technology landscape. Analysts see SOCAMM as a potential disruptor for both the memory and substrate ecosystems, blending affordability with top-tier data throughput.
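As a back-of-the-envelope illustration, Micron's two claims compound: 2.5 times the bandwidth at two-thirds the power works out to roughly 3.75 times the bandwidth per watt. A minimal sketch using normalized, hypothetical baseline figures (not official specifications):

```python
# Illustrative only: combining the article's stated ratios to estimate
# the bandwidth-per-watt gain of SOCAMM over a server RDIMM baseline.

# Hypothetical RDIMM baseline, normalized to 1.0 for illustration.
rdimm_bandwidth = 1.0  # normalized bandwidth
rdimm_power = 1.0      # normalized power draw

# Micron's claims as reported: 2.5x bandwidth, one-third less power.
socamm_bandwidth = rdimm_bandwidth * 2.5
socamm_power = rdimm_power * (1 - 1 / 3)

# Ratio of bandwidth-per-watt figures.
ratio = (socamm_bandwidth / socamm_power) / (rdimm_bandwidth / rdimm_power)
print(f"Bandwidth per watt vs RDIMM: {ratio:.2f}x")  # prints 3.75x
```

The one-third power reduction applies to the module, so this simple ratio ignores any system-level overheads; it only shows how the two headline figures multiply.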

The rollout strategy for SOCAMM targets both enterprise and consumer sectors, beginning with AI servers and professional workstations before expanding into personal computers. By driving down costs and power requirements while raising bandwidth, SOCAMM positions itself as a scalable alternative to traditional memory solutions in Artificial Intelligence applications. Nvidia's selection of Micron as its volume supplier underscores Micron's rapid gains in innovation and manufacturing capability over its larger competitors. The industry's embrace of SOCAMM is expected to shape future memory architectures, opening the market to new possibilities in high-performance computing for business and everyday users alike.


