Nvidia taps Micron for large-scale SOCAMM memory rollout in 2025

Nvidia will deploy up to 800,000 SOCAMM modules in 2025, with Micron beating Samsung and SK Hynix to the production lead for its next-generation Artificial Intelligence platforms.

Nvidia has announced plans to deploy between 600,000 and 800,000 SOCAMM memory modules in 2025 as it looks beyond traditional high-bandwidth memory (HBM) solutions. The company is integrating SOCAMM technology into its forthcoming Artificial Intelligence-focused products, including the GB300 'Blackwell' platform and the new Digits AI PC system, which it recently showcased at GTC 2025. According to industry sources, Nvidia has provided detailed projections to memory and substrate suppliers, signaling a coordinated push for volume deployment. The company collaborated with Samsung Electronics, SK Hynix, and Micron Technology on SOCAMM co-development, but reports from Digitimes Asia, ET News, and Wccftech identify Micron as the first to secure approval for full-scale production.

SOCAMM, short for Small Outline Compression Attached Memory Module, builds on LPDDR DRAM to deliver higher bandwidth at greater efficiency. Micron claims that SOCAMM provides 2.5 times the bandwidth of conventional server RDIMM modules while cutting module size and power usage to roughly one third. This combination of higher performance and lower energy consumption directly addresses the growing demands of next-generation Artificial Intelligence workloads. Although the planned SOCAMM deployment is smaller in scale than Nvidia's anticipated procurement of 9 million HBM units for the same year, it represents a meaningful shift in the memory technology landscape. Analysts see SOCAMM as a potential disruptor for both the memory and substrate ecosystems, blending affordability with top-tier data throughput.
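To put those ratios in perspective, the short Python sketch below works through the implied per-module numbers. The RDIMM baseline bandwidth and power figures are illustrative assumptions rather than values reported by Micron or Nvidia; only the 2.5x bandwidth multiplier and the one-third size and power ratio come from the claims above.

# Back-of-envelope comparison of the SOCAMM figures quoted above.
# The RDIMM baseline values are illustrative assumptions, not vendor data;
# only the 2.5x bandwidth multiplier and the one-third size/power ratio
# come from the article's claims.

RDIMM_BANDWIDTH_GBPS = 64.0   # assumed baseline per-module bandwidth (GB/s)
RDIMM_POWER_W = 15.0          # assumed baseline per-module power draw (W)

BANDWIDTH_MULTIPLIER = 2.5    # "2.5 times the bandwidth of conventional RDIMMs"
SIZE_POWER_RATIO = 1 / 3      # "one third the size and power"

socamm_bandwidth = RDIMM_BANDWIDTH_GBPS * BANDWIDTH_MULTIPLIER
socamm_power = RDIMM_POWER_W * SIZE_POWER_RATIO

# Bandwidth delivered per watt improves by 2.5 / (1/3) = 7.5x under these ratios.
efficiency_gain = BANDWIDTH_MULTIPLIER / SIZE_POWER_RATIO

print(f"SOCAMM bandwidth (assumed baseline): {socamm_bandwidth:.0f} GB/s")
print(f"SOCAMM power (assumed baseline):     {socamm_power:.1f} W")
print(f"Implied bandwidth-per-watt gain:     {efficiency_gain:.1f}x")

Under those ratios, bandwidth delivered per watt would improve by roughly 7.5 times regardless of the baseline chosen, which is the kind of efficiency gain the article points to for AI server deployments.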

The rollout strategy for SOCAMM targets both enterprise and consumer sectors, beginning with AI servers and professional workstations before expanding into personal computers. By driving down costs and power requirements while raising bandwidth, SOCAMM positions itself as a scalable alternative to traditional memory solutions in Artificial Intelligence applications. Nvidia's selection of Micron as its volume supplier underscores the latter's rapid gains in innovation and manufacturing capability over leading competitors. The industry's embrace of SOCAMM is expected to shape future memory architectures, opening the market to new possibilities in high-performance computing for business and everyday users alike.
