Liquid cooling to scale in artificial intelligence data centers in 2025

TrendForce projects that liquid cooling adoption in artificial intelligence data centers will rise from 14% in 2024 to 33% in 2025 as NVIDIA rolls out GB200 NVL72 rack servers, moving deployments from pilot projects to large-scale rollouts.

TrendForce's latest research finds that the rollout of NVIDIA's GB200 NVL72 rack servers in 2025 will accelerate upgrades in artificial intelligence data centers and drive liquid cooling adoption from early pilot projects to large-scale deployment. The firm projects penetration of liquid cooling in artificial intelligence data centers to climb from 14% in 2024 to 33% in 2025, with growth continuing in subsequent years. The research positions the 2025 hardware refresh as the catalyst that shifts liquid cooling from experimental use to broad operational deployment.

TrendForce highlights a sharp rise in the power consumption of GPUs and ASICs used in artificial intelligence servers as the primary technical driver for the transition. As an example, the research cites NVIDIA's GB200 and GB300 NVL72 systems, which it reports have a thermal design power of 130 to 140 kilowatts per rack. TrendForce notes that those power density levels far exceed the capabilities of traditional air cooling and that this gap has pushed data center operators to explore alternative cooling approaches.
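To put the cited 130 to 140 kilowatt per-rack figure in perspective, a back-of-envelope calculation can estimate the coolant flow a liquid loop would need to carry that heat away. The sketch below is illustrative only and is not from the report: it assumes a water coolant, a 10 kelvin temperature rise across the rack, and the standard heat equation Q = ṁ · c_p · ΔT; the function name and parameters are hypothetical.

```python
# Hedged back-of-envelope sketch: coolant flow needed to remove the
# 130-140 kW per-rack TDP that TrendForce cites for GB200/GB300 NVL72.
# Assumptions (not from the source): water coolant, 10 K temperature
# rise, specific heat 4186 J/(kg*K), density ~1 kg per litre.

def coolant_flow_lpm(heat_load_w: float, delta_t_k: float = 10.0,
                     specific_heat: float = 4186.0) -> float:
    """Return the required coolant flow in litres per minute for a
    given heat load, from Q = m_dot * c_p * delta_T."""
    mass_flow_kg_s = heat_load_w / (specific_heat * delta_t_k)
    return mass_flow_kg_s * 60.0  # ~1 kg of water per litre

print(round(coolant_flow_lpm(140_000), 1))  # ≈ 200.7 L/min per rack
```

Roughly 200 litres per minute of water per rack under these assumptions, which illustrates why air, with a volumetric heat capacity about 3,500 times lower than water, cannot practically serve racks at this density.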

The research specifically points to early adoption of liquid-to-air cooling technologies as a response to the higher rack power densities. By identifying the GB200 family and similar high-density systems as a tipping point, TrendForce frames 2025 as the moment when liquid cooling transitions from niche pilots to larger installations. Details beyond the projected penetration rates, the cited thermal design power figures, and the link to liquid-to-air adoption were not stated in the report excerpt provided.


