Liquid cooling to scale in Artificial Intelligence data centers in 2025

TrendForce projects that liquid cooling adoption in Artificial Intelligence data centers will rise from 14% in 2024 to 33% in 2025 as NVIDIA rolls out GB200 NVL72 rack servers, moving deployments from pilot projects to large-scale rollouts.

TrendForce's latest research finds that the rollout of NVIDIA's GB200 NVL72 rack servers in 2025 will accelerate upgrades in Artificial Intelligence data centers and drive liquid cooling adoption from early pilot projects to large-scale deployment. The firm projects liquid cooling penetration in Artificial Intelligence data centers to climb from 14% in 2024 to 33% in 2025, with growth continuing in subsequent years. The research positions the 2025 hardware refresh as a catalyst that shifts liquid cooling from experimental use toward broader operational deployment.

TrendForce highlights a sharp rise in the power consumption of GPUs and ASIC chips used in Artificial Intelligence servers as the primary technical driver of the transition. As an example, the research cites NVIDIA's GB200 and GB300 NVL72 systems, which it reports have a thermal design power of 130 to 140 kilowatts per rack. TrendForce notes that these power density levels far exceed the capabilities of traditional air cooling systems, and that this gap has pushed data center operators to explore alternative cooling approaches.
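A rough heat-balance calculation illustrates why racks at these power levels strain air cooling. The sketch below is not from the TrendForce report; it assumes standard air properties and a hypothetical 15 K allowable air temperature rise to estimate the airflow a single 135 kW rack (the midpoint of the cited 130 to 140 kW range) would demand.

```python
# Back-of-the-envelope airflow needed to air-cool one ~135 kW rack.
# Assumptions (illustrative, not from the source): air density 1.2 kg/m^3,
# specific heat 1005 J/(kg*K), and a 15 K inlet-to-outlet temperature rise.

RACK_POWER_W = 135_000   # midpoint of the 130-140 kW per-rack TDP range cited
AIR_DENSITY = 1.2        # kg/m^3
AIR_CP = 1005.0          # J/(kg*K)
DELTA_T = 15.0           # K, assumed allowable air temperature rise

# Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT)
mass_flow = RACK_POWER_W / (AIR_CP * DELTA_T)   # kg/s of air required
volume_flow = mass_flow / AIR_DENSITY           # m^3/s
cfm = volume_flow * 2118.88                     # cubic feet per minute

print(f"{mass_flow:.1f} kg/s ~ {volume_flow:.1f} m^3/s ~ {cfm:,.0f} CFM")
```

Under these assumptions the rack needs on the order of 15,000 CFM of airflow, several times what conventional rack fans and raised-floor delivery are typically designed for, which is consistent with the report's conclusion that operators must look beyond air cooling.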

The research specifically points to early adoption of liquid-to-air cooling technologies as a response to the higher rack power densities. By identifying the GB200 family and similar high-density systems as a tipping point, TrendForce frames 2025 as the moment when liquid cooling transitions from niche pilots to larger installations. Details beyond the projected penetration rates, the cited thermal design power figures, and the link to liquid-to-air adoption were not stated in the report excerpt provided.


