Getting it Right – How Much Air Containment is Enough in Your Data Center?

Credit to Author: Victor Avelar | Date: Tue, 19 Mar 2019 18:00:10 +0000

We’ve heard this question often enough that we decided to test containment across a range of “leakiness” and develop some practical guidance.

You may have heard the term “percent leakage” as a means of assessing the effectiveness of a hot or cold aisle air containment solution. But what does this actually mean? How would you measure it or validate a particular claim in your own data center? Does this percent leakage spec correlate to your desired data center operating temperature range? You may be paying for a tighter tolerance than is really needed. Or worse yet, paying for a “tight” solution that doesn’t perform in practice as it did in a test lab. In this blog, you’ll learn some practical metrics and practices to help ensure your IT equipment stays cool and your energy bills stay low.

What is “percent leakage”?

I searched for containment-related documents that include the term “% leakage” and found two loose definitions, neither of which is specific about the measurement criteria. The first defines percent leakage as the open area (the holes) of a containment surface as a percent of the total area. But do you include the racks in the surface area, or only the containment structure? What about the tops of the racks? Is this measured with no IT equipment installed? Each of these choices significantly changes the percent value.

The second defines percent leakage as the amount of air that “leaks” as a percentage of the air required by the IT equipment at a given pressure difference (ΔP) between the hot and cold aisles. By “leak” they mean air that either bypasses the servers or is oversupplied through them. These documents then specify percent leakage values of no more than 3%, less than 5%, and so on.
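To make that airflow-based definition concrete, here is a minimal sketch with hypothetical numbers (not values from any study):

```python
# Hypothetical numbers illustrating the airflow-based definition of
# percent leakage. "Leakage" is supply air that never passes through
# (or is forced through) the IT equipment, expressed as a percentage
# of the airflow the IT equipment actually requires. The value is only
# meaningful at a stated hot-to-cold aisle pressure difference (dP).

it_airflow_cfm = 8000.0        # airflow required by the IT equipment (assumed)
supplied_airflow_cfm = 8350.0  # airflow delivered to the contained aisle (assumed)

leakage_cfm = supplied_airflow_cfm - it_airflow_cfm
percent_leakage = 100.0 * leakage_cfm / it_airflow_cfm

print(f"Leakage: {leakage_cfm:.0f} CFM = {percent_leakage:.1f}% of IT airflow")
# Output: Leakage: 350 CFM = 4.4% of IT airflow
```

The arithmetic is trivial; the hard part, as the next section shows, is measuring those two airflows reliably in a production room.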

The variables when measuring percent leakage

On the surface, a percent leakage specification seems logical, but in practice it’s not so simple. Imagine you just finished your containment project and had to validate a percent leakage value. How would you practically verify that only 3% of the surface area is open? How would you verify that only 5% of the air bypasses the servers? How would you know if some of this “bypass” air was actually forced through the servers? What if you used a different number of racks than the specification assumed? The leakage flow rate is directly proportional to the number of racks. Note that the racks tend to be the leakiest element in a fully contained pod, so there’s no point in installing containment that’s significantly tighter than the racks themselves.

These metrics fail to provide a consistent basis for comparison across all pod configurations. Here are just some of the variables you’ll find in a production environment that complicate a percent leakage validation:

  • Most IT equipment has fan speeds that vary with % utilization, IT supply air temperature, failure modes, etc.
  • Racks from various vendors (e.g., IBM storage, Cisco FlexPod) have different airflow characteristics.
  • Room geometry (shape, building columns, ceiling height, etc.) affects airflow patterns.
  • Power and data cabling practices cause differences in penetrations through racks and containment systems.
  • Jets of high-velocity air can dramatically affect IT inlet temperatures (and localized pressure), regardless of how much containment you have.
  • Airflows from different computer room air handling (CRAH) units interact with each other.
  • CRAH units may or may not have a turning vane on the air supply plenum.
  • CRAH units may be fixed speed or variable speed.

Use these practical metrics instead

We propose three practical metrics: the average temperature, the maximum temperature, and the temperature variation of the IT supply air at the rack front. Variation is measured as the standard deviation from the average (more on this later). You can measure these either through the temperature sensors onboard your IT equipment or through external temperature sensors attached to the front of your racks. I prefer using the same type of temperature sensor on the front of every rack (top, middle, and bottom) to avoid differences in sensor accuracy. Not only are these metrics easy to measure in practice, they also reflect the performance of your data center as a system, including the bulleted variables above. Most importantly, they capture how much temperature variation your IT equipment can tolerate, or that you’re comfortable with. These values can even indicate the likelihood that you’re missing a blanking panel somewhere. In a fully contained pod, you shouldn’t see more than a 2.8°C/5°F difference between your average and maximum IT inlet temperatures.
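As a quick illustration of these three metrics, here is a short sketch (the sensor readings are hypothetical) that computes the average, maximum, and standard deviation of rack-front inlet temperatures and flags the 2.8°C/5°F spread mentioned above:

```python
import statistics

# Hypothetical rack-front inlet temperatures (degrees C), e.g. one reading
# each from sensors at the top, middle, and bottom of three racks.
inlet_temps_c = [22.1, 22.4, 21.9, 23.0, 22.6, 25.8, 22.2, 22.8, 22.5]

avg_c = statistics.mean(inlet_temps_c)
max_c = max(inlet_temps_c)
std_c = statistics.stdev(inlet_temps_c)  # variation around the average

print(f"Average inlet temperature: {avg_c:.1f} C")
print(f"Maximum inlet temperature: {max_c:.1f} C")
print(f"Standard deviation:        {std_c:.1f} C")

# Rule of thumb from the text: in a fully contained pod the maximum
# should stay within 2.8 C (5 F) of the average; a larger gap may point
# to a missing blanking panel or another leakage path.
if max_c - avg_c > 2.8:
    print("Spread exceeds 2.8 C -- check for missing blanking panels or leaks.")
```

With these sample readings the spread works out to about 3.0°C, so the check fires; the single warm reading plays the role of a rack with a missing blanking panel.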

Finding the right temperature range

This is a key decision for your containment project. The tighter you set your IT supply temperature range (at the front of the rack), the more containment you’ll need (e.g., blanking panels, brush strips, ducting, doors, gaskets). Of course, if there is some external heat source near your IT rack (e.g., a transformer), no amount of containment will reduce that elevated temperature. A few factors help you establish this range:

Thermal runtime – some people set their IT supply air temperature very low (e.g. 18°C/65°F), giving them time to set up emergency cooling in case of a cooling system failure. This factor helps establish your minimum IT supply air temperature. The Data Center Temperature Rise Calculator helps you find your thermal runtime before you reach a maximum temperature (a rough back-of-envelope sketch follows after this list).

Cooling system redundancy – if you have 2N cooling system redundancy, chances are you’re willing to increase your minimum IT supply air temperature to something close to the ASHRAE maximum recommended temperature (27°C/80.6°F).

ASHRAE upper limits – this tends to be a good place to start for the upper limit of your temperature range. If you’re willing to go above the ASHRAE maximum recommended temperature, the maximum allowable is 32°C/89.6°F for the A1 equipment class.

Chiller plant efficiency – increasing your IT supply air temperature setpoint allows you to elevate your chilled water temperature, which can increase chiller plant efficiency and economizer hours (if applicable).

Special equipment – you’re constrained if most of your IT equipment requires a very tight operating temperature range. If only a few racks require this, you can locate them in a separately conditioned room.
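For the thermal runtime factor above, here is a rough, back-of-envelope sketch of the physics involved. It is not the Temperature Rise Calculator’s model: it assumes all IT heat goes into the room air alone, ignoring the thermal mass of the IT gear, racks, and building, so real runtimes are typically longer. All input values are assumptions for illustration.

```python
# Pessimistic air-only estimate of thermal runtime after a total
# cooling failure. Real rooms have far more thermal mass, so actual
# runtime is longer -- use a proper tool for design decisions.

it_load_kw = 100.0       # total IT heat load (assumed)
room_volume_m3 = 500.0   # room air volume (assumed)
air_density = 1.2        # kg/m^3 at typical room conditions
air_cp = 1.005           # kJ/(kg*K), specific heat of air

start_temp_c = 18.0      # low supply setpoint from the text (65 F)
max_temp_c = 32.0        # ASHRAE A1 maximum allowable

air_mass_kg = room_volume_m3 * air_density
# Rate of temperature rise = heat input / (mass * specific heat)
rise_rate_c_per_s = it_load_kw / (air_mass_kg * air_cp)

runtime_min = (max_temp_c - start_temp_c) / rise_rate_c_per_s / 60.0
print(f"Temperature rise rate: {rise_rate_c_per_s * 60:.1f} C/min")
print(f"Air-only runtime to {max_temp_c:.0f} C: {runtime_min:.1f} minutes")
```

For these inputs the air-only estimate comes out to well under two minutes, which is exactly why operators with little cooling redundancy start from a low setpoint: every extra degree of headroom buys runtime.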

More in store for data center containment

That’s a lot of information! And there’s more to come. Look for my next blog, where I’ll summarize the containment study and show the results of five different levels of containment. We’ll see how each level deviates from the “airtight” average temperature and standard deviation. In the meantime, check out some other blogs I’ve published, or leave a comment below to let me know what you think about this one. Also, want to jump ahead? Check out Schneider’s containment architecture that adapts to a variety of cooling and power configurations – HyperPod™.
