AMD Launches Pensando Pollara 400 AI NIC to Accelerate Scalable Artificial Intelligence Workloads

AMD unveils the Pensando Pollara 400, a fully programmable network card designed to optimize large-scale Artificial Intelligence and machine learning deployments.

AMD has officially announced general availability of its Pensando Pollara 400 AI NIC, which is now shipping to customers. The card targets the accelerating demands of Artificial Intelligence, large language models, and agentic Artificial Intelligence applications. The surge in these advanced workloads is prompting data centers and enterprises to seek parallel computing infrastructure that delivers superior performance while remaining adaptable to evolving Artificial Intelligence and machine learning needs. A critical technological challenge in this context is scaling inter-node GPU-to-GPU communication networks for maximum efficiency.

Reflecting its commitment to open ecosystems and customer choice, AMD designed the Pensando Pollara 400 AI NIC as a fully programmable network interface controller aligned with the emerging standards defined by the Ultra Ethernet Consortium (UEC). The offering promises to reduce total cost of ownership by enabling flexible, future-proof data center architectures without compromising performance. The Pollara 400 is engineered to provide the scalable, high-throughput connectivity that Artificial Intelligence and machine learning clusters require, making it easier for organizations to build and expand robust Artificial Intelligence infrastructure.

The launch underscores AMD’s strategic focus on both innovation and open standards in Artificial Intelligence networking. By supporting high-speed, programmable, and UEC-compliant network capabilities, the Pensando Pollara 400 is positioned as a pivotal component for next-generation Artificial Intelligence data centers. With the product now available and shipping to customers, AMD is reinforcing its role in accelerating Artificial Intelligence deployment at scale, helping organizations meet both current and future infrastructure demands for increasingly complex and resource-intensive Artificial Intelligence workloads.

Impact Score: 71

Port Washington vote challenges Artificial Intelligence data center expansion

Port Washington, Wisconsin, voters approved a measure that gives residents more control over large tax-incentivized development projects tied to the Artificial Intelligence infrastructure boom. The local pushback is emerging as a closely watched test of how communities respond to massive data center expansion.

Anthropic launches managed agents for enterprise development

Anthropic has introduced Claude Managed Agents, a new tool aimed at helping enterprises build and deploy Artificial Intelligence agents more quickly by handling core infrastructure tasks. The release adds to Anthropic’s recent product push as it competes for a fast-growing enterprise market.

Meta launches Muse Spark for its apps

Meta has introduced Muse Spark, an in-house large language model designed for its products and positioned as the first in a broader Muse family. The model brings multimodal reasoning, coding, shopping, and recommendation features to the Meta Artificial Intelligence app and website, with wider rollout planned.

Microsoft scales back Copilot in Windows 11 apps

Microsoft is pulling back some Copilot branding and interface elements from core Windows 11 apps after sustained user criticism. Notepad and Snipping Tool are among the latest apps to lose the prominent Copilot button as the company repositions some features.
