AMD said the agentic artificial intelligence (AI) era is pushing CPU usage higher in compute infrastructure and changing how compute nodes are configured. The company described a shift away from CPUs serving mainly as host processors for GPU-heavy systems toward a model where CPUs play a more active role in running and coordinating agent-driven workloads.
Lisa Su said that in the past the CPU served primarily as a host node in a 1:4 or 1:8 CPU-to-GPU configuration, and that the ratio is now shifting closer to 1:1. She added that in deployments running very large numbers of agents, a node could plausibly contain more CPUs than GPUs.
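To make the ratios concrete, here is a minimal, hypothetical sketch (the function name and the 8-GPU node size are illustrative assumptions, not figures from AMD) showing how many host CPUs each ratio implies for a fixed GPU count:

```python
import math

def cpus_for_node(gpu_count: int, cpus_per_gpu: float) -> int:
    """Return the CPU count implied by a CPU:GPU ratio for one node.

    Hypothetical helper for illustration only; rounds up so a node
    always has at least one host CPU.
    """
    return math.ceil(gpu_count * cpus_per_gpu)

# For an assumed 8-GPU node: the historical 1:8 ratio implies 1 host CPU,
# 1:4 implies 2, and the emerging 1:1 ratio implies 8.
for label, cpus_per_gpu in [("1:8", 1 / 8), ("1:4", 1 / 4), ("1:1", 1.0)]:
    print(f"{label} ratio, 8 GPUs -> {cpus_for_node(8, cpus_per_gpu)} CPU(s)")
```

Under this arithmetic, moving from 1:8 to 1:1 multiplies the host-CPU count per node by eight, which is the direction of the demand shift Su described.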
The change is tied to how large language models interact with host processors in agentic AI systems. Instead of mainly initiating GPU operations for training and inference, the host CPU is increasingly used for continuous updates and the orchestration of agents, making it more central to system behavior as AI workloads become more agentic.
AMD made the comments during its first-quarter 2026 earnings discussion, presenting the trend as a notable architectural change in modern compute nodes. In the company's view, CPU demand could rise alongside accelerator demand as agentic AI systems require more coordination and ongoing host-side processing.
