Networking for Artificial Intelligence: building the foundation for real-time intelligence

At the 2025 Ryder Cup, HPE deployed an on-site private-cloud network and operational dashboard to support real-time decisions for nearly a quarter million attendees. The deployment highlights how networking must evolve to support Artificial Intelligence inference, edge processing, and self-driving network operations.

The 2025 Ryder Cup at Bethpage Black in Farmingdale, New York, offered a high-profile test of networking built for Artificial Intelligence. To serve nearly a quarter million spectators and a flood of connected devices, HPE built a central operations hub. That Connected Intelligence Center ingested ticket scans, weather reports, GPS-tracked golf carts, concession and merchandise sales, spectator queues, and network performance feeds, and it combined inputs from 67 Artificial Intelligence-enabled cameras into a private-cloud dashboard that gave staff a real-time operational view. Jon Green, CTO of HPE Networking, framed the deployment as proof that “disconnected Artificial Intelligence doesn’t get you very much; you need a way to get data into it and out of it for both training and inference.”

Engineers addressed the venue’s density and mobility challenges with a two-tiered architecture. A front-end layer of more than 650 Wi-Fi 6E access points, 170 network switches, and 25 user experience sensors collected live video and movement data, while a back-end layer in a temporary on-site data center linked GPUs and servers in a high-speed, low-latency configuration that served as the system’s brain. That back end fed a private-cloud Artificial Intelligence cluster for live analytics, allowing models to process footage and surface the most interesting shots. The article emphasizes that networks for Artificial Intelligence must deliver ultra-low latency, lossless throughput, and adaptability at scale, because inference workloads are gated by the slowest computation in the pipeline.

The piece also situates the Ryder Cup example within broader industry trends. An HPE cross-industry survey of 1,775 IT leaders found that 45 percent can run real-time data pushes and pulls today, up from 7 percent in 2024, yet many organizations still struggle to operationalize data pipelines. The rise of physical Artificial Intelligence is prompting organizations to repatriate workloads to edge and on-premises clusters for faster, safer inference in contexts like self-driving vehicles and factory floors. HPE’s telemetry practice, which processes more than a trillion telemetry points daily, feeds AIOps models that already surface recommendations and may someday enable self-driving networks that automate routine fixes and mass configuration changes. The article concludes that network performance increasingly defines business performance and that building inference-ready networks will separate pilots from scaled Artificial Intelligence deployments.

Impact Score: 58

Nvidia acquisition of SchedMD raises Slurm neutrality concerns

Nvidia’s purchase of SchedMD has given it control of Slurm, an open-source scheduler that sits at the center of many supercomputing and large-model training systems. Researchers and engineers are watching for signs that support could tilt toward Nvidia hardware over AMD and Intel alternatives.

Mustafa Suleyman says Artificial Intelligence compute growth is still accelerating

Mustafa Suleyman argues that Artificial Intelligence development is being propelled by simultaneous advances in chips, memory, networking, and software efficiency rather than nearing a hard limit. He contends that rising compute capacity and falling deployment costs will push systems beyond chatbots toward more capable agents.

China and the US are leading different Artificial Intelligence races

The US leads in large language models and advanced chips, while China has built a major advantage in robotics and humanoid manufacturing. That balance is shifting as Chinese developers narrow the gap in model performance and both countries push to combine software and machines.

Congress weighs Artificial Intelligence transparency rules

Bipartisan lawmakers are pushing a federal transparency standard for the largest Artificial Intelligence models as Congress works on a broader national framework. The proposal aims to increase public trust while avoiding stricter state-by-state requirements and heavier regulation.

Report finds California creative job losses are not driven by Artificial Intelligence

New research from Otis College of Art and Design finds California’s recent creative industry job losses stem from cost pressures and structural shifts, not direct worker displacement by generative Artificial Intelligence. The technology is changing workflows and expectations, but it is largely replacing tasks rather than entire jobs.
