Understanding data centre cooling methods

One of the most critical tasks in any data centre is cooling. Data centres consume a lot of power, which translates into heat, and they require cooling to safeguard the servers they host and the data on them. High temperatures or humidity risk damaging IT equipment, or worse, starting a fire, so sufficient steps must be taken to minimise the risk of heat build-up and power surges and to protect valuable infrastructure.

How important is cooling for data centres?

The importance of data centre cooling is rising as the data centre market continues to grow rapidly. The global big data market is expected to surpass US$279bn, registering a CAGR of 14.3% over the forecast period 2022-2032. For years, the power an average rack required increased steadily. Now, however, rack density needs are beginning to grow at much faster rates.

A report from AFCOM suggests the average rack density increased by nearly an entire kilowatt (kW) in 2020, jumping to 8.2 kW per rack from 7.3 kW in 2019. Moreover, while around 29% of data centres in 2020 reported an average density of 10 kW or higher, some anticipate that a density of 15 to 20 kW per rack will become the norm by 2025.

Given the growth in rack density and escalating levels of power usage, data centre cooling has never been more important.

What is data centre cooling?

Data centre cooling is a method of controlling the temperature inside a data centre to reduce heat. Typically, this is achieved through a combination of equipment, tools, systems, techniques and processes to guarantee ideal temperatures and humidity levels are maintained at all times.

Data centre cooling systems will help guide the flow of heat and cooling around the data centre to achieve maximum efficiency, regulating parameters including temperatures, cooling performance, energy consumption and cooling fluid flow characteristics.

Intelligent cooling systems are connected to networks of sensors throughout each data centre, measuring the performance of the customers’ hardware as well as the ambient conditions inside and outside of the building. The cooling is intelligently modulated to suit the conditions within the data centre and as these systems become increasingly intelligent, they are working harder and faster at achieving the best possible cooling efficiency.

Why is data centre cooling important?

A failure to manage the heat and airflow inside a data centre can negatively impact businesses and their performance. It can lead to equipment damage, system failure or data loss, and costs the business more in both money and resources.

To achieve sufficient cooling, the temperature within data centres needs to be maintained 24×7. This requires a great deal of energy and data centre operators are challenged to continually reduce consumption. Energy efficiency can be diminished because of the volume of resources that need to be dedicated to keeping temperature down. Without an effective data centre cooling approach, the risk of overheating significantly increases.

The high costs associated with data centre cooling infrastructure are why many companies have moved away from on-premises data centres to colocation data centres like Telehouse. Data centres can spend up to 40% of their total energy consumption on cooling, so efficient cooling is essential for reducing operational costs. Yet most on-premises data centres lack the efficiency or the monitoring capabilities to fully optimise infrastructure for cooling demands.
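To make the "up to 40% of energy on cooling" figure concrete, it maps directly onto PUE (Power Usage Effectiveness: total facility energy divided by IT energy). The sketch below is an illustrative simplification that assumes all non-cooling energy goes to IT equipment; the 40% share comes from the text above, the rest is arithmetic.

```python
def pue_from_cooling_share(cooling_share: float, other_overhead_share: float = 0.0) -> float:
    """PUE implied when cooling (and any other overhead) take these
    fractions of total facility energy; the remainder is assumed to be IT load."""
    it_share = 1.0 - cooling_share - other_overhead_share
    return 1.0 / it_share  # PUE = total energy / IT energy

# A site spending 40% of its energy on cooling implies a PUE of about 1.67.
print(f"PUE with 40% cooling share: {pue_from_cooling_share(0.40):.2f}")
```

Seen this way, every percentage point of cooling energy saved moves the facility closer to the ideal PUE of 1.0.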

Benefits of data centre cooling

  • Longer technology lifespan: Data centre cooling makes hardware more durable and reduces the need for organisations to spend significant sums replacing infrastructure.
  • Greater data centre efficiency: Flexible liquid cooling technologies, or air-cooling systems that can quickly change cold air usage can help improve efficiency and address hot spots quickly.
  • Guaranteed server uptime: The most effective data centre cooling technologies enable servers to stay online longer and reduce the issues of overheating.

Types of cooling methods

  • Air cooling: With this approach, cold air is blown across or circulated through the hardware, exchanging warmer air for cooler air and dispersing the heat.
  • Liquid cooling: Uses water or another coolant to cool data centre servers. Using a Computer Room Air Handler (CRAH) remains a popular way of combining liquid and air cooling. Liquid cooling can also significantly reduce energy consumption compared with many air-cooling systems.
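The air-cooling approach above comes down to a simple sensible-heat relation: the airflow needed scales with the heat load and the allowable temperature rise. The sketch below illustrates this for the 8.2 kW average rack mentioned earlier; the 12 °C temperature rise is an assumed example value, not a figure from the text.

```python
AIR_DENSITY = 1.2   # kg/m^3, air at roughly 20 degrees C
AIR_CP = 1005.0     # J/(kg.K), specific heat capacity of air

def required_airflow_m3_per_s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to carry away heat_load_w of server heat
    while the air warms by delta_t_k (from Q = m_dot * cp * dT)."""
    mass_flow = heat_load_w / (AIR_CP * delta_t_k)  # kg/s of air
    return mass_flow / AIR_DENSITY                  # m^3/s of air

# An 8.2 kW rack with a 12 K intake-to-exhaust rise needs ~0.57 m^3/s of air.
flow = required_airflow_m3_per_s(8200, 12)
print(f"{flow:.2f} m^3/s (~{flow * 2118.9:.0f} CFM)")
```

This is why rising rack densities push operators toward liquid cooling: water's far higher heat capacity per unit volume means much less flow is needed to move the same heat.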

Specific data centre cooling methods

  • Cold and hot aisles: Cold and hot aisle containment leverages alternating rows of “cold aisles” and “hot aisles.” Hot aisles contain the air exhausts at the rear of the racks, while cold aisles have the cold air intakes at the front. As part of the approach, hot aisles expel hot air into the air conditioning intakes; the air is then chilled and vented into the cold aisles. Empty rack slots are fitted with blanking panels to prevent cold air being wasted and hot air recirculating.
  • Liquid immersion cooling: Liquid immersion cooling involves IT components and other electronics, including complete servers and storage devices, being submerged in a thermally conductive but electrically insulating dielectric liquid or coolant. The approach enhances server reliability and reduces maintenance and service calls, because servers are protected from air pollutants such as dust, and from the oxygen and moisture that cause corrosion.
  • Raised floor: Many data centres use this approach. Raised flooring systems are composed of panels supported on steel pedestals to deliver cold air to servers through the sub-floor.
  • Computer Room Air Conditioner (CRAC): This is a device that monitors and maintains the temperature, air distribution and humidity in a data centre. Units are similar to conventional air conditioners: a compressor draws air across a refrigerant-filled cooling coil.
  • Computer Room Air Handler (CRAH): A computer room air handler (CRAH) is a device used to deal with the heat produced by data centre equipment. A CRAH uses fans, cooling coils and a water chiller system to remove heat.
  • In-rack heat extraction: This method directly extracts the heat generated by servers, pumping it outside and cooling the hardware using compressors and chillers located inside the rack. The main disadvantage with this approach is that it limits computational density per rack, which restricts the levels of data centre power that can be included on a single floor.
  • Chilled water system: Mid-to-large data centres commonly use a chilled water approach. Chilled water cools the air being brought in by air handlers (CRAHs); the water is supplied by a chiller plant located within the data centre itself.
  • Critical cooling load: This measurement represents the total usable cooling capacity on the data centre floor available to cool servers. It does not include cooling capacity outside the data centre or server room floor.
  • Evaporative cooling: This method manages temperature by exposing warm air to water; as the water evaporates, it draws heat out of the air. The approach is gaining traction in data centres despite the extra humidity monitoring it requires.
  • Calibrated vectored cooling (CVC): This is a type of data centre cooling specifically designed for high-density servers. It optimises the airflow path through equipment to allow the cooling system to handle heat more efficiently. This, in turn, enables users to increase the ratio of circuit boards per server chassis and ultimately to use fewer fans.
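For the chilled water systems described above, sizing follows the same sensible-heat relation as air cooling, just with water's much higher heat capacity. The sketch below is illustrative only; the 500 kW critical cooling load and 6 K supply/return temperature split are assumed example values.

```python
WATER_CP = 4186.0  # J/(kg.K), specific heat capacity of water

def chilled_water_flow_kg_per_s(cooling_load_w: float, delta_t_k: float) -> float:
    """Mass flow of chilled water needed to absorb cooling_load_w
    while warming by delta_t_k between supply and return."""
    return cooling_load_w / (WATER_CP * delta_t_k)

# A 500 kW floor with a 6 K supply/return split needs roughly 20 kg/s of water.
flow = chilled_water_flow_kg_per_s(500_000, 6)
print(f"{flow:.1f} kg/s of chilled water")
```

Compare this with air: moving the same 500 kW at a similar temperature rise would take hundreds of cubic metres of air per second, which is why chilled water dominates in mid-to-large facilities.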

Data centre cooling with Telehouse

Telehouse continues to operate in the most efficient way to best serve its customers. We embed environmentally sustainable best practices in all our operations and constantly strive to adopt the highest standards to enhance energy efficiency, foster green procurement and reduce our carbon footprint.

To improve cooling within our data centres, Telehouse utilises market-leading cooling technology to improve efficiency and reduce the amount of electricity required. We also pioneered the world’s first multi-storey indirect adiabatic cooling system, designed to a BREEAM Excellent standard and located in the Telehouse North Two data centre.

Since 2019, Telehouse has powered its operations with 100% renewable energy and we are working towards a plan to deliver net zero operations. Telehouse is also fully compliant with the Energy Savings Opportunity Scheme (ESOS) and holds environmental permits that regulate emissions and air quality for combustion plants.

In November 2017, we entered into the European Union scheme for Emissions Trading which is set up to monitor and measure CO2 emissions from fuel consumption and we actively report emissions data as required by the Streamlined Energy and Carbon Reporting (SECR). Telehouse is also now considered an ultra-small emitter under the UK Emissions Trading Scheme, demonstrating the company’s commitment to reducing carbon emissions.

To find out more about our sustainability commitments, visit: Environmentally Responsible Colocation
