Artificial Intelligence growth is about to hit a memory wall

Orders for Artificial Intelligence chips and infrastructure are surging across vendors, but limited memory bandwidth risks slowing deployments and leaving GPU capacity underutilized. Industry executives say technologies such as Compute Express Link (CXL) and SSD advances can help, but no single cure exists.

Orders for Artificial Intelligence gear are pouring in across the industry, not just at chip giant Nvidia. The article reports that AMD has secured deals with Meta, Microsoft, OpenAI and Oracle. Cisco has already booked an undisclosed sum of Artificial Intelligence orders this year, and Broadcom revealed it landed a contract of undisclosed value for racks built on its XPU chips. The surge in compute and infrastructure spend is colliding with constraints data centers are already wrestling with: power, cooling and networking demands.

Beyond those limits, memory bandwidth is emerging as a central bottleneck for scaling Artificial Intelligence. Experts quoted in the article say GPUs are often restricted by the need to reach external memory over interconnects that slow data movement, and CPUs face the same wall. J. Gold Associates founder Jack Gold is quoted saying that moving memory closer to GPUs and speeding access yields major performance improvements. ScaleFlux VP of products JB Baker described the mismatch as a gap between how many calculations chips can perform and how much memory bandwidth exists to feed them, comparing it to a massive water source drained through a garden hose.
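Baker's garden-hose analogy maps onto the standard roofline model: if a workload performs fewer arithmetic operations per byte fetched than the hardware's compute-to-bandwidth ratio, the processor idles waiting on memory rather than computing. A minimal sketch of that check, using hypothetical hardware numbers that are illustrative only and not taken from the article:

```python
# Back-of-envelope roofline check: is a workload compute-bound or
# memory-bound? All figures below are hypothetical, for illustration.

def bound_by(flops_per_byte: float, peak_tflops: float, mem_bw_tbps: float) -> str:
    """Compare a workload's arithmetic intensity (FLOPs per byte moved)
    against the hardware's balance point (peak compute / memory bandwidth).
    Below the balance point, the memory system is the bottleneck."""
    balance = (peak_tflops * 1e12) / (mem_bw_tbps * 1e12)  # FLOPs per byte
    return "memory-bound" if flops_per_byte < balance else "compute-bound"

# Hypothetical accelerator: 1000 TFLOP/s peak compute, 3 TB/s memory
# bandwidth => balance point of roughly 333 FLOPs per byte. Workloads
# that stream large weights with little reuse sit far below that line.
print(bound_by(flops_per_byte=2.0, peak_tflops=1000, mem_bw_tbps=3))    # memory-bound
print(bound_by(flops_per_byte=500.0, peak_tflops=1000, mem_bw_tbps=3))  # compute-bound
```

On these illustrative numbers, faster or closer memory raises the bandwidth term, lowers the balance point, and lets more of the GPU's peak compute actually be used.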

There is no single cure, but steps are underway. The article notes Compute Express Link (CXL), solid-state drive advances and vendor efforts such as Nvidia's NVLink and Storage-Next initiatives as partial solutions. It also names other players, including Intel spin-off Cornelis, working on speeding memory access. Baker argued that on-premises adopters must rebalance capital expenditures to invest in memory, storage and networking as well as GPUs, warning that underinvesting in those areas will waste GPU spend and burn power with idle processors.


Generative Artificial Intelligence security coverage on CSO Online

CSO Online’s generative Artificial Intelligence hub tracks how security teams and attackers are using large language models, from agentic Artificial Intelligence risks to malware campaigns and supply chain governance. The section combines news, opinion, and practical guidance aimed at CISOs adapting to rapidly evolving Artificial Intelligence-driven threats.

Can Intel return as a serious artificial intelligence compute competitor?

The article argues that Intel does not need to beat Nvidia in frontier model training to benefit from the artificial intelligence boom, but must execute well in CPUs, inference, enterprise deployments, and artificial intelligence PCs. It outlines where Intel can realistically compete against Nvidia and AMD by 2026-2027 if its manufacturing and product roadmaps stay on track.

How Intel chief Lip-Bu Tan turned a Trump clash into a government lifeline

Intel CEO Lip-Bu Tan transformed an early morning social media attack from President Trump into a pivotal Oval Office deal that secured a multibillion-dollar equity investment and reshaped the chipmaker’s future. The agreement has boosted Intel’s fortunes while raising new questions about industrial policy, conflicts of interest, and the company’s ability to regain its manufacturing edge in the age of artificial intelligence.
