NVIDIA links data centers into a unified Artificial Intelligence supercomputer with Spectrum-XGS Ethernet

NVIDIA unveiled Spectrum-XGS Ethernet to interconnect multiple geographically separated data centers into a single giga-scale Artificial Intelligence super-factory. The platform promises distance-aware networking that delivers predictable low-latency performance across campuses, cities, and continents.

Data center networking is central to distributed computing and to future Artificial Intelligence workloads that may span millions of GPUs. NVIDIA introduced Spectrum-XGS Ethernet as an extension of its Spectrum-X networking platform, designed to link multiple geographically separated data centers into a unified, giga-scale Artificial Intelligence super-factory. The company said Spectrum-XGS removes the capacity limits of single facilities by adding distance-aware networking, which aims to provide predictable, low-latency performance across campuses, cities, and continents.

The changes are delivered primarily through software and firmware updates to existing Spectrum-X switches and ConnectX SuperNICs rather than through new silicon. Spectrum-XGS includes auto-adjusted congestion control tuned for long-haul links, precise latency management to reduce jitter, and comprehensive end-to-end telemetry. That telemetry is intended to allow operators to visualize and control network traffic across multiple sites, giving visibility into cross-facility flows and making behavior across long distances more predictable for distributed workloads.
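NVIDIA has not published the Spectrum-XGS congestion-control algorithm, but the standard bandwidth-delay-product reasoning behind distance-aware tuning can be sketched: the longer a link's round-trip time, the more data must be in flight to keep it saturated, so cross-city links need far larger congestion windows than in-building links. The functions and numbers below are illustrative assumptions, not NVIDIA's implementation.

```python
# Illustrative sketch only: NVIDIA has not disclosed the Spectrum-XGS
# algorithm. This shows the bandwidth-delay-product (BDP) arithmetic that
# motivates distance-aware congestion control on long-haul links.

def bdp_bytes(bandwidth_gbps: float, rtt_ms: float) -> int:
    """Bytes that must be in flight to saturate a link (bandwidth x RTT)."""
    return int(bandwidth_gbps * 1e9 / 8 * rtt_ms / 1e3)

def window_for_link(bandwidth_gbps: float, rtt_ms: float, mtu: int = 4096) -> int:
    """Hypothetical congestion window, in packets, sized to the link's BDP."""
    return max(1, -(-bdp_bytes(bandwidth_gbps, rtt_ms) // mtu))  # ceiling division

# A 400 Gb/s link inside a facility (0.01 ms RTT) vs. between cities (10 ms RTT):
local = window_for_link(400, 0.01)      # ~123 packets in flight
long_haul = window_for_link(400, 10)    # ~122,000 packets in flight
print(local, long_haul)
```

The roughly thousand-fold gap between the two windows is why congestion control tuned for intra-facility distances behaves poorly across cities, and why long-haul links call for distance-aware adjustment.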

NVIDIA reported measurable performance improvements from the updates, saying Spectrum-XGS nearly doubles the throughput of the NVIDIA Collective Communications Library (NCCL) for multi-GPU, multi-node training jobs and large-scale experiments. Those gains are presented as efficiency improvements for distributed Artificial Intelligence training and inference. NVIDIA positioned the technology as a new axis of growth for infrastructure: after scale-up inside servers and scale-out inside data centers, a new scale-across approach connects facilities into unified compute fabrics as demand for massive distributed compute grows.


What businesses need to know about the EU Cyber Resilience Act

The EU Cyber Resilience Act is turning product cybersecurity into a legal requirement for companies that sell digital products into the European Union. A key compliance milestone arrives in September 2026, well before the full regulation takes effect in 2027.

Claude Mythos and cyber insurance’s next inflection point

Claude Mythos is being treated by governments and regulators as a potential systemic cyber risk with implications for financial stability and insurance markets. Its emergence is intensifying pressure on insurers to clarify whether Artificial Intelligence-enabled cyber losses are covered, excluded, or require new stand-alone products.

OpenAI expands ChatGPT ads with self-serve manager

OpenAI is widening its ChatGPT ads pilot with a beta self-serve Ads Manager, new bidding options and broader measurement tools. The push signals a deeper move into advertising as the company expands the program into several international markets.

OpenAI launches Artificial Intelligence deployment consulting unit

OpenAI has created a new consulting and deployment business aimed at helping enterprises build and roll out Artificial Intelligence systems. The move mirrors a similar push by Anthropic and signals a broader effort by model providers to capture more of the enterprise services market.

SK Group warns DRAM shortages could curb memory use

SK Group chairman Chey Tae-won warned that customers may reduce memory consumption through infrastructure and software optimization if DRAM suppliers fail to raise output. Demand from Artificial Intelligence data centers is keeping the market tight as memory makers weigh expansion against the long timelines for new fabs.
