Cadence Unveils 12.8 Gbps Gen2 DDR5 MRDIMM IP Validated on TSMC N3

Cadence launches DDR5 12.8 Gbps MRDIMM Gen 2 IP on TSMC N3, targeting soaring Artificial Intelligence data center demands.

Cadence has announced the industry's first DDR5 12.8 Gbps MRDIMM Gen 2 memory IP system solution, validated on TSMC's advanced N3 process. The new offering directly addresses escalating memory bandwidth requirements in enterprise and data center applications driven by Artificial Intelligence workloads, including cloud-based deployments. The architecture builds on Cadence's established DDR5 and GDDR6 product lines, promising scalability and adaptability for next-generation system demands.

The DDR5 MRDIMM Gen 2 IP solution delivers a comprehensive memory subsystem by integrating both a PHY and a high-performance controller. In recent hardware validation with Gen 2 MRDIMMs, the solution achieved a top-tier 12.8 Gbps data rate, effectively doubling the bandwidth of standard DDR5-6400 DRAM modules. Features such as ultra-low latency encryption and advanced reliability, availability, and serviceability (RAS) underline Cadence's commitment to robust, high-speed memory design, ensuring security and data integrity at scale.
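The doubling claim follows directly from the per-pin transfer rates. A back-of-the-envelope sketch, assuming a standard 64-bit DIMM data bus (ECC bits excluded; these are peak theoretical figures, not vendor benchmarks):

```python
def peak_bandwidth_gbs(transfer_rate_mtps: float, bus_width_bits: int = 64) -> float:
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer.

    Assumes a 64-bit-wide data bus, the standard width of a DDR5 DIMM
    (excluding ECC bits).
    """
    bytes_per_transfer = bus_width_bits / 8
    return transfer_rate_mtps * bytes_per_transfer / 1000  # MB/s -> GB/s


ddr5_6400 = peak_bandwidth_gbs(6400)    # standard DDR5-6400 module
mrdimm_gen2 = peak_bandwidth_gbs(12800)  # Gen 2 MRDIMM at 12.8 Gbps per pin

print(f"DDR5-6400:    {ddr5_6400:.1f} GB/s")    # 51.2 GB/s
print(f"MRDIMM Gen 2: {mrdimm_gen2:.1f} GB/s")  # 102.4 GB/s
print(f"ratio:        {mrdimm_gen2 / ddr5_6400:.1f}x")  # 2.0x
```

At 12.8 Gbps per pin, a module's peak bandwidth works out to roughly 102 GB/s versus about 51 GB/s for DDR5-6400, which is the 2x figure the announcement cites.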

Designed to support advanced system-on-chips (SoCs) and chiplets, the DDR5 MRDIMM Gen 2 IP offers flexible floor plan options to accommodate different integration requirements while allowing application-specific tuning for power and performance optimization. Engagements with leading customers in Artificial Intelligence, high-performance computing (HPC), and data centers underscore the solution's early traction and leadership in advancing memory technology. The launch signals a significant step forward for hardware designers seeking to address the throughput and reliability challenges of modern data-driven infrastructures.

Impact Score: 65

Legal risks around Artificial Intelligence data poisoning

Data poisoning is emerging as a major legal and operational risk in Artificial Intelligence systems, particularly through weaknesses in the upstream data supply chain. UK and EU rules are increasing pressure on organisations to strengthen due diligence, contracts and monitoring.

IBM, Red Hat, and Google donate llm-d to CNCF

IBM Research, Red Hat, and Google Cloud have donated llm-d, an open-source Kubernetes framework for large language model inference, to the CNCF as a sandbox project. The move aims to create a vendor-neutral blueprint for deploying scalable inference across models, accelerators, and clouds.

AAMU named regional lead for Amazon Web Services machine learning university

Alabama A&M University has been named a regional lead institution for Amazon Web Services Machine Learning University, expanding its role in Artificial Intelligence and machine learning education, research, and workforce development. The designation follows the university’s recent national HBCU summit on Artificial Intelligence and sets up new curriculum, faculty training, and student career pathways across the Southeast.
