
Data Center Cooling: Liquid vs. Air Cooling

Amid rising thermal management demands in data centers, data center cooling is experiencing a liquid revolution. 

Traditionally, data centers have relied on air-based cooling. However, liquid cooling — which includes direct-to-chip and immersion cooling — is gaining favor for a number of reasons.

According to one study published in "Applied Thermal Engineering": "Transistor congestion and rising demand for parallel processing are pushing the thermal design power of microprocessors well beyond 280 W, a limit for air cooling, and are expected to surpass 700 W by 2025."1

More compute and more racks mean more heat to dissipate away from critical IT equipment. As anyone in the industry knows, these are significant investments: you want to make sure they're protected so they can keep running properly. 

Data Center Cooling Background

Air cooling has been the dominant method for managing heat in data centers since the earliest days of commercial computing. As far back as the 1950s and 1960s, mainframe rooms relied on chilled air delivered through raised floors and perimeter Computer Room Air Conditioning (CRAC) units. These early facilities were relatively low-density, and the heat produced by hardware could be effectively removed by simply moving sufficiently cool air across the equipment. Through the 1980s and 1990s, as enterprise IT expanded, air cooling remained the standard across corporate data centers, government computing sites, university labs, and colocation facilities. The approach was simple, reliable, and inexpensive compared with more specialized cooling technologies.

Things started to change as the internet as we know it began to boom. 

By the early 2000s, the rise of large-scale internet services pushed data center growth dramatically, and hyperscale operators such as Google, Microsoft, and Amazon refined air-cooling systems to maximize efficiency. Hot-aisle and cold-aisle containment became common, along with more efficient CRAC and CRAH (Computer Room Air Handler) units, outside-air economization, and advanced airflow engineering.

Air cooling worked well for densities up to roughly 10–15 kW per rack, and innovative containment designs helped extend its usefulness even further. As a result, nearly every type of data center — from small on-prem server rooms to massive cloud campuses — continued using air cooling as the primary thermal management strategy.

Even today, air cooling continues to be used by facilities with lower rack densities. A 2024 Uptime survey found that larger data centers were more likely to have implemented direct liquid cooling than their smaller counterparts (those under 5 MW). Furthermore, when asked at what IT rack density direct liquid cooling would become preferable to air cooling, 29% of respondents pointed to densities of 20–29 kW. 

However, as computing density rose in the 2010s and especially the 2020s, air cooling began to reach its practical limits. High-performance computing (HPC), AI training clusters, and GPU-intensive workloads routinely drove rack densities beyond what chilled air could efficiently handle. Attempting to move enough air to cool extremely dense racks required higher fan speeds, greater energy consumption, and more complex airflow engineering.

The result? Diminishing returns.
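One way to see why: the sensible-heat relation Q = ṁ · cp · ΔT means the airflow required scales linearly with rack power. The minimal Python sketch below makes this concrete, assuming typical air properties and an illustrative 12 °C inlet-to-outlet temperature rise (both values are assumptions for illustration, not facility specifications):

```python
def required_airflow_cfm(rack_power_w, delta_t_k=12.0):
    """Volumetric airflow (CFM) needed to absorb rack_power_w of heat."""
    cp_air = 1005.0   # J/(kg·K), specific heat of air (typical value)
    rho_air = 1.2     # kg/m³, air density near 20 °C (assumed)
    mass_flow_kg_s = rack_power_w / (cp_air * delta_t_k)  # from Q = m·cp·ΔT
    vol_flow_m3_s = mass_flow_kg_s / rho_air
    return vol_flow_m3_s * 2118.88  # 1 m³/s ≈ 2118.88 CFM

for kw in (10, 15, 40, 80):
    print(f"{kw:>2} kW rack → {required_airflow_cfm(kw * 1000):,.0f} CFM")
```

Under these assumptions, a 15 kW rack needs on the order of 2,000 CFM, and an 80 kW AI rack needs several times that — airflow that demands ever-faster fans and ever-more-aggressive airflow engineering.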

This pushed operators to explore more efficient alternatives: liquids. 

The gradual transition to liquid cooling began in specialized environments such as supercomputing labs, where cold-plate technology and rear-door heat exchangers provided superior thermal performance. Fast forward to today: as AI hardware adoption surges, liquid cooling has gained mainstream acceptance. Direct-to-chip cooling and immersion cooling have become increasingly attractive thanks to their ability to handle very high heat loads while reducing energy use.

Today, air cooling can still be found in smaller data centers, primarily edge data centers.

But liquid cooling is expanding rapidly, particularly in AI-focused and high-density cloud environments.

The industry is now in a hybrid era: air cooling remains essential for general-purpose workloads, while liquid cooling is emerging as the standard for the hottest, most demanding compute tasks.

Benefits of Moving from Air-Based to Liquid Cooling for Data Centers

For buyers, understanding the trade-offs of various data center cooling methods — cost versus installation complexity, safety, maintenance requirements, and operational efficiency — is critical. 

Here are some of the benefits of moving away from air-based cooling to liquid cooling for data center cooling needs: 

  • Higher Efficiency: Liquid cooling removes heat more effectively than air, enabling higher server densities without overheating.
  • Energy Savings: Reduced reliance on air conditioning and fans lowers overall energy consumption, helping cut operational costs.
  • Space Optimization: Because liquid cooling is more efficient, data centers can support more equipment in the same footprint.
  • Consistent Performance: Liquid cooling maintains stable temperatures, reducing the risk of thermal throttling under heavy workloads.
  • Reduced Noise and Dust: With less reliance on air movement, liquid cooling helps minimize particulate buildup and noise levels in the data center.
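To put the Energy Savings point in rough numbers, a common way to compare facilities is PUE (power usage effectiveness), where overhead power equals the IT load times (PUE − 1). The sketch below is a back-of-the-envelope comparison only: the PUE values, the 1 MW IT load, and the $0.10/kWh rate are all illustrative assumptions, not measured data.

```python
def annual_overhead_cost(it_load_kw, pue, price_per_kwh=0.10):
    """Yearly cost of non-IT (cooling/fan/etc.) power at a given PUE."""
    overhead_kw = it_load_kw * (pue - 1.0)   # power beyond the IT load itself
    return overhead_kw * 8760 * price_per_kwh  # kWh per year × $/kWh

air = annual_overhead_cost(1000, pue=1.6)     # assumed air-cooled PUE
liquid = annual_overhead_cost(1000, pue=1.2)  # assumed liquid-cooled PUE
print(f"Air-cooled overhead:    ${air:,.0f}/yr")
print(f"Liquid-cooled overhead: ${liquid:,.0f}/yr")
print(f"Difference:             ${air - liquid:,.0f}/yr")
```

Even a few tenths of a point of PUE improvement translates into six-figure annual savings at megawatt scale under these assumptions.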

What else do I need to know about data center cooling? 

If you're looking to upgrade your data center cooling process, there are several other things you need to be thinking about: 

What About Leaks?

Any time you have a liquid-based solution, leaks are a concern.

However, while leaks are a real risk, they are extremely low probability when:

  • Systems are properly installed
  • Facilities follow standard coolant handling protocols
  • Staff are trained on connectors, manifolds, and preventive maintenance (PM) procedures

In addition, Dober fluids are engineered to minimize corrosion, reducing long-term leak potential. In short, leak risk is more about process than the coolant itself.

How About Installation? 

What are the pros and cons of various installation approaches? Below is a brief summary: 

Internal Installs

  • Pros: more control, faster response times, in-house expertise
  • Cons: higher staffing cost, need specialized training

Outsourced Installs

  • Pros: turnkey experience, experienced technicians, warranty/guarantees
  • Cons: additional vendor cost, scheduling dependencies

Dober does not provide installation services directly, but we regularly collaborate with partners and can recommend trusted providers. 

 

Maintenance, Monitoring, and Upgrades

Maintenance, monitoring, and upgrades of your cooling system matter more than ever. Here's what you should be doing to ensure your system stays in tip-top shape: 

  • Monitoring coolant condition
  • Tracking inhibitor performance
  • Planning fluid refresh cycles
  • Detecting contamination early
  • Adopting sensors and telemetry, which are becoming standard

Liquid cooling also enables future upgrades, including:

  • Higher chip thermal design power (TDP)
  • AI cluster expansion
  • Rack-level density planning
Traditional Air-Based Cooling vs. Liquid Cooling

To sum it all up, below you'll find a breakdown of data center cooling via air-based solutions versus cooling with liquids, like Dober's COOLWAVE DC data center coolant.

| Aspect | Air-Based Cooling | Liquid Cooling (e.g., COOLWAVE DC) |
| --- | --- | --- |
| Cooling Efficiency | Less efficient; struggles with high-density heat loads. | More efficient; effectively removes heat even under maximum load conditions. |
| Space Requirements | Requires more space for air circulation and equipment. | Requires less space; more compact design. |
| Installation Complexity | Simpler installation; well-understood technology. | More complex installation; requires integration with cooling systems. |
| Maintenance | Requires regular maintenance of air filters and fans. | Requires monitoring of fluid properties and system components. |
| Energy Consumption | Higher energy consumption due to reliance on fans and air circulation. | Lower energy consumption; more efficient heat transfer. |
| Scalability | Less scalable; limited by air circulation capacity. | More scalable; can handle increasing heat loads effectively. |

Liquids are the future of data center cooling

Like so many other technologies, data center cooling has evolved and continues to evolve as thermal management demands skyrocket. 

Air cooling was once the method of choice, but that has changed — except for select smaller operations — as thermal management needs exceed the capacity and economics of air cooling systems. 

The main takeaway for data center operators? Survey your facility and determine which type of liquid cooling works best for your organization's needs. 

If you're planning the next 3–5 years of data center growth, liquid cooling isn't just an upgrade — it's the foundation.

Visit our Dober COOLWAVE page to learn more about what Dober has been doing in the data center cooling space. 

Learn More About Dober COOLWAVE

REFERENCES

1. Mohammad Azarifar, Mehmet Arik, Je-Young Chang, Liquid cooling of data centers: A necessity facing challenges, Applied Thermal Engineering, Volume 247, 2024, 123112, ISSN 1359-4311, https://doi.org/10.1016/j.applthermaleng.2024.123112. (https://www.sciencedirect.com/science/article/pii/S1359431124007804)