Supermicro backs upcoming NVIDIA Vera Rubin NVL72 and HGX Rubin NVL8 platforms

Supermicro is expanding manufacturing and liquid cooling capabilities to support early delivery of data center solutions built on the upcoming NVIDIA Vera Rubin NVL72 and NVIDIA HGX Rubin NVL8 platforms.

Supermicro has announced that it will support the upcoming NVIDIA Vera Rubin and Rubin platforms with an expanded focus on manufacturing capacity and advanced liquid cooling. The company is collaborating with NVIDIA to target first-to-market delivery of data-center-scale systems optimized for the NVIDIA Vera Rubin and Rubin platforms. Supermicro positions this move as part of its strategy to address rapid growth in artificial intelligence, cloud, storage, and 5G/edge workloads with tightly integrated infrastructure offerings.

According to the announcement, Supermicro will leverage its accelerated development cycles and long-running collaboration with NVIDIA to rapidly deploy flagship NVIDIA Vera Rubin NVL72 and NVIDIA HGX Rubin NVL8 systems. The company highlights its Data Center Building Block Solutions (DCBBS) approach as a core differentiator, stating that this modular design methodology enables streamlined production, broad customization options, and faster time to deployment for customers building next-generation artificial intelligence infrastructure. The focus on flexibility is intended to serve both hyperscale data centers and large enterprises that require tailored configurations.

Supermicro president and chief executive officer Charles Liang emphasized that the vendor's agile building-block strategy and extended partnership with NVIDIA allow it to bring advanced artificial intelligence platforms to market more quickly than competitors. He stated that expanded manufacturing capacity, combined with what the company describes as industry-leading liquid cooling expertise, is meant to help hyperscalers and enterprises roll out NVIDIA Vera Rubin and Rubin platform infrastructure at scale with improved speed, efficiency, and reliability. The announcement frames these capabilities as a way for customers to gain a competitive edge as they deploy new artificial-intelligence-driven services.
