How Full Liquid Cooling Is Powering the Next Generation of AI Data Centers

As AI workloads grow, traditional cooling methods are no longer enough. Modern high-performance data centers are now built around full liquid cooling architectures designed to manage the extreme heat generated by advanced AI processors.

At the facility level, water from the building cooling system flows into in-row Coolant Distribution Units (CDUs). Inside, a liquid-to-liquid heat exchanger transfers cooling capacity to a secondary fluid that circulates directly to each rack, creating an efficient bridge between facility cooling and IT equipment.

Inside every server, a dedicated liquid loop is engineered to match the processor layout and power density of AI hardware. Instead of relying on air, this loop absorbs heat directly from CPUs, GPUs, and memory modules, removing thermal energy at the source. The heated liquid then returns to the CDU, where high-performance heat exchangers move the heat away from the IT space toward the facility cooling system. From there, rooftop chillers or dry coolers reject the heat into the ambient environment.

Even in fully liquid-cooled data centers, air still plays a supporting role. Air handlers remove residual heat from components not connected to the liquid loop, creating a balanced ecosystem where liquid handles high-density loads and air maintains room stability.

Full liquid cooling is becoming a foundation for AI-ready infrastructure, enabling higher rack densities, better efficiency, and stable performance under extreme compute demand. As a Data Center Operations & Maintenance Engineer, I closely follow how these cooling architectures are transforming operations and facility design. Always happy to connect with professionals working on next-generation, AI-ready data centers.

Video copyright: BOYD © Abdullah Mahrous – CC BY 4.0
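The secondary-loop sizing behind a rack-level CDU can be sketched with the standard heat balance Q = ṁ·cp·ΔT. The rack power and temperature rise below are illustrative assumptions, not figures from the post; real CDU sizing also accounts for pressure drop, approach temperatures, and redundancy.

```python
# Rough sizing sketch for a rack-level liquid cooling loop.
# Illustrative numbers only; water-like coolant assumed.

def required_flow_lpm(heat_kw: float, delta_t_c: float,
                      cp_kj_per_kg_c: float = 4.18,
                      density_kg_per_l: float = 1.0) -> float:
    """Coolant flow (L/min) needed to absorb heat_kw with a delta_t_c rise.

    Q = m_dot * cp * dT  =>  m_dot = Q / (cp * dT)
    """
    mass_flow_kg_s = heat_kw / (cp_kj_per_kg_c * delta_t_c)
    return mass_flow_kg_s / density_kg_per_l * 60.0

# Example: a 100 kW AI rack with a 10 °C coolant temperature rise
flow = required_flow_lpm(heat_kw=100.0, delta_t_c=10.0)
print(f"Required flow: {flow:.1f} L/min")  # ~143.5 L/min
```

Doubling the allowed temperature rise halves the required flow, which is one reason warm-water loops are attractive for high-density racks.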
Cooling System Design Innovations
Explore top LinkedIn content from expert professionals.
Summary
Cooling system design innovations refer to creative and advanced methods for keeping buildings, data centers, and devices at safe temperatures, often using new materials, technologies, or repurposed infrastructure. These approaches aim to handle rising heat loads from modern tech, reduce energy use, and cut carbon emissions.
- Explore liquid cooling: Consider using liquid cooling solutions for high-powered data centers and electronics to manage extreme heat where traditional air cooling falls short.
- Repurpose existing infrastructure: Take advantage of underused spaces like sewer tunnels to create city-wide cooling networks that quietly and efficiently reduce temperatures without visible equipment.
- Upgrade passive solutions: Implement smart paint, optimized fan placement, and ventilated furniture to boost comfort and reduce reliance on energy-hungry air conditioning.
Breaking the thermal wall with material innovation

Performance is now limited by heat as much as logic. Beating the thermal wall demands a materials-first approach paired with tight electro-thermo-mechanical co-design.

What moves the needle:
- Next-gen TIMs: liquid-metal gallium alloys for ultra-low interface resistance; sintered silver for near-bulk conductivity and high-temp stability; phase-change and graphene/graphite-enhanced TIMs for thin, reliable bond lines.
- Heat spreading: ultrathin vapor chambers, pyrolytic graphite sheets, and composite lids (e.g., Cu-diamond) to flatten hot spots before the sink.
- Microchannel cooling: single-phase cold plates for hundreds of W/cm² with modest ΔP; two-phase and jet impingement for the highest flux; additive-manufactured manifolds and fins to unlock flow and surface area.
- Package co-design: direct-to-die cooling, embedded spreaders, and low-CTE, high-k substrates to manage both heat and warpage.

From concept to production:
- Engineer the interface: flatness, roughness, bondline control, and clamp load dominate real-world Rθ.
- Prove reliability: resist pump-out, dry-out, creep, and galvanic effects; ensure coolant/material compatibility.
- Model and measure: disciplined compact models and standardized test methods keep simulations honest.

How we can help: We combine materials science with system co-design to turn thermal limits into headroom. We have all the Credence design tools and can help with thermal management using the best TIMs and microchannel solutions for your challenging application. Share your power map, allowable pressure drop, and constraints, and we'll deliver a material stack and cooling architecture with modeled junction temps, flow/pressure requirements, and a clear reliability plan.
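The interface and spreading resistances above add up in series, which is why shaving milli-kelvin-per-watt off each layer matters. A minimal sketch of the junction-temperature math, with a purely assumed resistance stack (the values are not from the post, and real numbers come from TIM datasheets and cold-plate characterization):

```python
# Junction-temperature sketch for a series thermal resistance stack.
# All resistance values below are illustrative assumptions.

def junction_temp_c(power_w: float, coolant_c: float,
                    r_stack_c_per_w: list[float]) -> float:
    """Tj = T_coolant + P * (sum of series thermal resistances in °C/W)."""
    return coolant_c + power_w * sum(r_stack_c_per_w)

# Assumed layers: die-to-lid TIM, lid spreader, lid-to-plate TIM, cold plate
stack = [0.02, 0.01, 0.03, 0.04]  # °C/W
tj = junction_temp_c(power_w=700.0, coolant_c=35.0, r_stack_c_per_w=stack)
print(f"Estimated junction temp: {tj:.0f} °C")
```

With these assumptions a 700 W device sits about 70 °C above the coolant; cutting the TIM resistances (e.g., with liquid metal) directly narrows that gap.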
-
Belgium Converts City Sewers Into Underground Cooling Loops for Data Centers and Supermarkets

Beneath the city of Antwerp, engineers in Belgium have begun piping cold water through old municipal sewer tunnels, not to treat waste, but to cool the buildings above. It's part of a radical new geo-loop cooling system that turns underused infrastructure into a city-wide thermal network.

Developed by KU Leuven and Hydroscan, the system uses narrow, insulated water lines threaded through decommissioned sewer pipes running beneath central Antwerp. These lines carry naturally cool groundwater to surface-level heat exchangers in data centers, supermarkets, and hospitals, where it absorbs waste heat without compressors or refrigerants. The cooled facilities then return warm water to the loop, where it gradually dissipates heat through soil contact or is re-cooled underground.

Unlike traditional air conditioning systems, there is no need for chillers or rooftop condensers, just quiet, passive, low-pressure flow driven by small pumps. Initial installations have reduced building cooling costs by 55% and carbon output by 80%. Because the sewer network already exists, the project avoids street excavation and permits fast retrofits. The entire system runs silently beneath people's feet, cutting heat without any visible equipment. It's not just smart cooling: it's recycling the city's underworld into a climate control system.
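The economics of a pumps-only loop come from avoiding the compressor. A back-of-envelope comparison, where the cooling load, the chiller COP of 4, and the effective pumps-only COP of 30 are all assumed values for illustration (the 55% cost saving cited in the post depends on local tariffs and loads):

```python
# Compressor chiller vs passive groundwater loop, energy only.
# COP values are illustrative assumptions, not project data.

def annual_cooling_kwh(load_kw: float, hours: float, cop: float) -> float:
    """Electrical energy needed to reject load_kw of heat at a given COP."""
    return load_kw * hours / cop

load_kw, hours = 500.0, 8760.0                                # continuous 500 kW load
chiller_kwh = annual_cooling_kwh(load_kw, hours, cop=4.0)     # typical chiller
loop_kwh = annual_cooling_kwh(load_kw, hours, cop=30.0)       # circulation pumps only
saving = 1 - loop_kwh / chiller_kwh
print(f"Estimated cooling-energy saving: {saving:.0%}")
```

Even with conservative pump assumptions the energy saving lands well above half, consistent in direction with the figures the post reports.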
-
AWS Builds Custom Liquid Cooling System for Data Centers

Amazon Web Services (AWS) is sharing details of a new liquid cooling system to support high-density AI infrastructure in its data centers, including custom designs for a coolant distribution unit and an engineered fluid. "We've crossed a threshold where it becomes more economical to use liquid cooling to extract the heat," said Dave Klusas, AWS's senior manager of data center cooling systems, in a blog post.

The AWS team considered multiple vendor liquid cooling solutions, but found none met its needs and began designing a completely custom system, which was delivered in 11 months, the company said. The direct-to-chip solution uses a cold plate placed directly on top of the chip. The coolant, a fluid specifically engineered by AWS, runs in tubes through the sealed cold plate, absorbing the heat and carrying it out of the server rack to a heat rejection system, and then back to the cold plates. It's a closed-loop system, meaning the liquid continuously recirculates without increasing the data center's water consumption.

AWS also developed a custom coolant distribution unit, which it said is more powerful and more efficient than its off-the-shelf competitors. "We invented that specifically for our needs," Klusas said. "By focusing specifically on our problem, we were able to optimize for lower cost, greater efficiency, and higher capacity." Klusas said the liquid typically runs at "hot tub" temperatures for improved efficiency.

AWS has shared details of its process, including photos: https://lnkd.in/e-D4HvcK
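The water-consumption point is worth quantifying. Open evaporative cooling towers consume water roughly in proportion to the energy they reject; a closed loop only needs fill and maintenance water. The ~1.8 L/kWh figure below is a commonly cited industry ballpark, not an AWS number, and the rack size is an assumption:

```python
# Sketch of closed-loop vs evaporative water use.
# WUE figure and rack power are illustrative assumptions.

def evaporative_water_liters(it_energy_kwh: float,
                             wue_l_per_kwh: float = 1.8) -> float:
    """Water evaporated by an open cooling tower for a given IT energy."""
    return it_energy_kwh * wue_l_per_kwh

rack_kw, hours = 100.0, 8760.0                      # one rack, one year
open_loop_l = evaporative_water_liters(rack_kw * hours)
closed_loop_l = 0.0  # coolant recirculates; ongoing consumption ~zero
print(f"Open-loop evaporation per rack-year: {open_loop_l / 1000:.0f} m³")
```

Under these assumptions a single 100 kW rack on evaporative cooling accounts for well over a thousand cubic meters of water a year, which is the consumption a closed recirculating loop avoids.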
-
The idea of submerging computer servers in a liquid coolant to cut data center energy consumption by 70% is a breakthrough in sustainable tech innovation. Traditional cooling systems consume significant energy, but with non-conductive liquid coolants, it's possible to safely dissipate heat while keeping electrical circuits safe and operational. This method optimizes thermal management, capturing all the generated heat and drastically reducing the need for conventional fans and chillers. Sandia National Laboratories' approach could set a new standard for energy efficiency in data centers, making them greener and more cost-effective. Florian Palatini
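One way to see where that kind of saving comes from is through PUE, the ratio of total facility energy to IT energy. The PUE values below are assumed for illustration (a typical air-cooled site vs an aggressive immersion target), not figures from Sandia:

```python
# PUE sketch contrasting air cooling with immersion cooling.
# PUE values are illustrative assumptions.

def facility_energy_kwh(it_kwh: float, pue: float) -> float:
    """Total facility energy = IT energy * PUE."""
    return it_kwh * pue

it_kwh = 1_000_000.0                                  # annual IT load
air = facility_energy_kwh(it_kwh, pue=1.5)            # assumed air-cooled site
immersion = facility_energy_kwh(it_kwh, pue=1.05)     # assumed immersion target
overhead_saving = (air - immersion) / (air - it_kwh)
print(f"Cooling/overhead energy reduction: {overhead_saving:.0%}")
```

The IT load itself does not shrink; the large percentage reductions quoted for immersion refer to the cooling overhead on top of it, which is what this ratio isolates.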
-
Microsoft reveals a new breakthrough in chip cooling technology!

Right now, most AI chips are cooled with "cold plates": metal blocks that pump liquid across the chip from the outside. It works, but it's already reaching its limit as AI chips get hotter with every new generation.

Microsoft's new approach goes inside the chip itself. They etched microscopic channels directly into the silicon, letting liquid coolant flow exactly where the heat is. The design was even inspired by nature, shaped like leaf veins to move liquid more efficiently.

The results: up to 3x better cooling compared to cold plates, and GPU temperatures dropping by as much as 65%. This means data centers can run more powerful AI chips, overclock safely, and waste less energy. Cool!

Follow Endrit Restelica for more tech stuff.
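The "3x better cooling" claim maps directly onto thermal resistance: at the same power, a third of the resistance means a third of the temperature rise above the coolant. The chip power and baseline resistance below are assumptions for illustration, not Microsoft's published figures:

```python
# How a 3x cut in thermal resistance lowers die temperature rise.
# Power and baseline resistance are illustrative assumptions.

def temp_rise_c(power_w: float, r_theta_c_per_w: float) -> float:
    """Temperature rise above coolant = P * R_theta."""
    return power_w * r_theta_c_per_w

power = 700.0                                   # W, assumed GPU power
cold_plate = temp_rise_c(power, 0.06)           # external cold plate
in_silicon = temp_rise_c(power, 0.06 / 3.0)     # ~3x better, per the post
print(f"Rise drops from {cold_plate:.0f} °C to {in_silicon:.0f} °C")
```

A smaller rise at the same power also means headroom to push power higher at the same die temperature, which is what makes safe overclocking plausible.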
-
AI is pushing data center cooling into a new era, and CDUs are at the center of the conversation.

As data centers scale to support next-generation AI platforms, cooling solutions are evolving just as fast. This past week highlighted that shift, with new systems announced to meet the extreme requirements of NVIDIA's upcoming Vera Rubin GPU platform and the broader move toward AI Factories.

🔹 DCX LIQUID COOLING SYSTEMS introduced a new cooling architecture with its Facility Distribution Unit, a centralized approach that moves CDUs outside the white space and enables cooling at the data hall level.
🔹 Schneider Electric, through its Motivair liquid cooling portfolio, unveiled the MCDU-70, a 2.5 MW modular CDU designed to scale in 10 MW building blocks, closely aligning with NVIDIA's Omniverse DSX blueprint for AI factories.

Why this matters: With rack densities trending toward 600 kW per rack, the industry is seeing more than just incremental change:
- A move toward centralized and modular cooling architectures
- Rising investment and consolidation across the liquid cooling ecosystem
- New entrants pushing the market forward

Alongside established vendors, companies like XNRGY Climate Systems are working to challenge traditional approaches with innovative ideas around high-density thermal systems, scalable manufacturing, and faster deployment models.

Bottom line: As NVIDIA's AI roadmap accelerates, cooling is becoming a first-order design decision for data centers. The next phase of innovation will be driven by both incumbents and disruptors willing to rethink how cooling infrastructure is designed and deployed.

#DataCenters #LiquidCooling #AIInfrastructure #CoolingInnovation #XNRGY #DigitalInfrastructure
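The figures in the post (600 kW racks, a 2.5 MW CDU, 10 MW building blocks) imply simple capacity ratios. The sketch below does that arithmetic, ignoring redundancy and derating, which real deployments would add:

```python
# Quick capacity math for the post's figures: 600 kW racks,
# a 2.5 MW CDU, and 10 MW building blocks. No redundancy modeled.

def units_supported(capacity_kw: float, per_unit_kw: float) -> int:
    """Whole units a cooling stage can support at full nameplate load."""
    return int(capacity_kw // per_unit_kw)

racks_per_cdu = units_supported(2500.0, 600.0)     # MCDU-70 vs 600 kW racks
cdus_per_block = units_supported(10000.0, 2500.0)  # 10 MW building block
print(racks_per_cdu, cdus_per_block)  # 4 4
```

At nameplate numbers one such CDU covers only about four maxed-out racks, which is why the post frames cooling as a first-order design decision rather than an afterthought.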