Qualcomm is exploring the use of Samsung's next-generation LPDDR6X memory in server-grade artificial intelligence (AI) accelerator hardware, extending low-power DRAM beyond its traditional role in mobile devices. After unveiling the AI200 and AI250 accelerator cards and racks in late October, Qualcomm drew attention by specifying LPDDR rather than the high-bandwidth memory (HBM) used in most cutting-edge enterprise AI platforms. According to a report by The Bell in South Korea, Samsung Electronics has already shipped prototype LPDDR6X samples to Qualcomm, indicating early collaboration around this yet-to-be-finalized memory technology.
Industry sources quoted by The Bell describe the move as unusual because the base LPDDR6 (non-X) standard received official JEDEC certification only recently, in Q3 2025. Samsung's LPDDR6X development project is still in progress, but it has been explicitly linked to the 2027 launch of Qualcomm's inference-focused AI250 server platform. A source cited by the publication believes that a tight AI200-series release roadmap drove the decision to supply LPDDR6X samples at such an early stage, allowing Qualcomm to validate and optimize its accelerators around the emerging standard well before commercial availability.
Rumors last month suggested that the AI200 and AI250 inference accelerators are being designed with SOCAMM2 in mind, aligning Qualcomm's hardware plans with new low-power memory module formats. In December, Samsung's memory product planning team outlined future branches of non-X technology, stating that "low-power DRAM technologies such as SOCAMM" are being spun off from "AI-focused module solution(s) leveraging LPDDR6 architecture." Taken together, Qualcomm's platform disclosures and Samsung's memory roadmap point to a strategy in which LPDDR6 and LPDDR6X derivatives, including SOCAMM and SOCAMM2, play a central role in the next generation of power-efficient AI inference servers.
