It’s Time to Consider Additional Cooling Options for Open Compute Project Data Centers

Credit to Author: John Niemann | Date: Wed, 23 Aug 2017 15:01:27 +0000

The Open Compute Project is a topic of interest in IT circles, as its reference architecture for data centers offers the potential to provide efficient, flexible and scalable compute power at relatively low cost. But since Facebook first announced the architecture in 2011, one area has yet to be revisited: cooling.

As Facebook originally envisioned it for its Prineville, Ore., data center, the architecture uses direct air economization for cooling. That works well in cooler climates like Oregon's, where you can often take advantage of cool outside air. But if customers consider additional cooling options, they'll find they're able to build OCP-compliant data centers in other, much warmer climates, as well as improve performance and minimize risk.

Indirect air economizer cooling, for example, greatly extends the geographies in which air economizer modes of cooling can be used. With indirect air economizer cooling, a heat exchanger isolates indoor air from outdoor air. The exchanger acts as a barrier between the indoor and outdoor environments, so the data center is far less susceptible to outdoor humidity, which lets the economizer run more hours and deliver higher efficiency.

When combined with evaporative cooling, you can extend the hours of economizer operation even further by leveraging the wet-bulb temperature, even on humid days, to provide partial cooling. That means you don't need colder ambient temperatures to bring the data center to an acceptable temperature. Once again, you get more hours of economizer use over the course of the year, increasing overall efficiency.
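To get a feel for why the wet-bulb temperature matters, here is a minimal Python sketch (not from the OCP spec or any Schneider Electric tool) that estimates wet-bulb temperature from dry-bulb temperature and relative humidity using Stull's 2011 empirical approximation; the example conditions are hypothetical.

```python
from math import atan, sqrt

def wet_bulb_c(dry_bulb_c, rel_humidity_pct):
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature
    and relative humidity, using Stull's (2011) empirical fit."""
    t, rh = dry_bulb_c, rel_humidity_pct
    return (t * atan(0.151977 * sqrt(rh + 8.313659))
            + atan(t + rh) - atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * atan(0.023101 * rh)
            - 4.686035)

# Example: a warm, fairly humid day -- 32 C (about 90 F) at 50% RH.
# The wet-bulb temperature comes out near 24 C, so an indirect evaporative
# stage can still remove a useful share of the heat before any compressor
# has to run.
print(round(wet_bulb_c(32.0, 50.0), 1))
```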

What's more, with direct systems there's always a chance that poor air quality or high humidity (from a fire or a thunderstorm, for example) will force you to turn the economizer off. That means you need to size the cooling compressors to handle 100% of the data center load if needed. With an indirect air economizer, outdoor air is kept separate from indoor air and the risk of contamination is minimized, so you only need enough compressor capacity to get through the hottest days of the year, saving you money on capital expenses.
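For a rough sense of the capital-expense argument, here is a back-of-the-envelope sketch in Python; the load and the 70% peak-day economizer contribution are hypothetical assumptions, not figures from the OCP spec or from Schneider Electric.

```python
# Illustrative compressor-sizing comparison (all numbers are hypothetical).
IT_LOAD_KW = 1000          # critical IT load to be cooled

# Direct air economizer: smoke, dust, or high humidity can force the
# economizer fully off, so mechanical cooling is sized for the whole load.
direct_dx_kw = IT_LOAD_KW * 1.0

# Indirect air economizer: outdoor air never enters the white space, so
# mechanical "trim" cooling only has to cover the shortfall on the hottest
# days -- assume the economizer plus evaporative stage still carries 70%.
peak_economizer_fraction = 0.70
indirect_dx_kw = IT_LOAD_KW * (1.0 - peak_economizer_fraction)

print(f"Direct design needs   ~{direct_dx_kw:.0f} kW of compressor capacity")
print(f"Indirect design needs ~{indirect_dx_kw:.0f} kW of compressor capacity")
```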

And those hot days may be fewer than you think. The OCP specifies data center operating temperatures in the range of 65° to 85° F. The idea was basically to match the data center temperature to whatever the temperature outside is. If it's cool enough outside to operate at 65° F, then fine. If not, it's OK to let the temperature rise to as much as 85° F. The idea is not to use additional energy to make the data center any cooler than 85° F if you don't have to.
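As a toy illustration of that "follow the outdoors" idea, the Python sketch below clamps a supply setpoint to the 65° to 85° F band; real economizer controls also weigh humidity, dew point and ride-through, which this ignores.

```python
def ocp_supply_setpoint_f(outdoor_temp_f, low=65.0, high=85.0):
    """Toy control rule: track the outdoor temperature, but never spend
    energy pushing the data center below `low`, and never let it drift
    above `high` (the 65-85 F band described above)."""
    return max(low, min(outdoor_temp_f, high))

# 50 F outside -> run at 65 F; 75 F outside -> run at 75 F; 95 F -> cap at 85 F.
for outdoor in (50, 75, 95):
    print(outdoor, "->", ocp_supply_setpoint_f(outdoor))
```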

Several years ago, ASHRAE TC 9.9 published the third edition of its Thermal Guidelines for Data Processing Environments, which outlined server reliability rates at various temperatures. The upshot was that servers, especially newer ones, can handle operating temperatures far higher than most data centers were using at the time, and probably still are.

What's more, we now know that in terms of reliability, operating IT gear at cooler temperatures offsets the times when you operate at higher temps, an issue I covered in this previous post. So, if you are in a cooler climate and can operate at, say, 65° F at times during the winter, that will offset the hours you operate at 85° F in the summer.
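Here is a small, hypothetical illustration of that time-weighted argument; the relative failure-rate factors and hour counts below are made up for the example (ASHRAE publishes the actual reliability curves), but they show how below-baseline hours can balance above-baseline ones.

```python
# Hypothetical time-weighted reliability example; factors are illustrative only.
hours_and_factors = [
    (3000, 0.9),   # cooler months: below-baseline relative failure rate
    (4260, 1.0),   # moderate months: baseline
    (1500, 1.2),   # hot afternoons: above-baseline relative failure rate
]

total_hours = sum(h for h, _ in hours_and_factors)          # 8,760 h = one year
weighted = sum(h * f for h, f in hours_and_factors) / total_hours
print(f"Time-weighted relative failure rate: {weighted:.3f}")
# Roughly 1.0 overall: the cool hours offset the hot ones.
```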

The point is, if you're interested in the OCP data center architecture, don't limit yourself to the direct air economization described in the guidelines. You've now got better options, notably indirect air economization. (Schneider Electric can help in that regard with the new Ecoflair Air Economizer.)

To learn more about the topic, check out our free white paper no. 215, “Choosing Between Direct and Indirect Air Economization for Data Centers.” It’ll be worth your while, because there’s real money to be saved on your data center cooling tab. Also, try the Economizer Mode PUE Calculator and see what works best for your data center.
