{"id":15763,"date":"2019-07-11T11:00:30","date_gmt":"2019-07-11T19:00:30","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2019\/07\/11\/news-9510\/"},"modified":"2019-07-11T11:00:30","modified_gmt":"2019-07-11T19:00:30","slug":"news-9510","status":"publish","type":"post","link":"https:\/\/www.palada.net\/index.php\/2019\/07\/11\/news-9510\/","title":{"rendered":"It\u2019s Not Just About Chip Density \u2013 Five Reasons to Consider Liquid Cooling for Your Data Center"},"content":{"rendered":"<p><strong>Credit to Author: Wendy Torell| Date: Thu, 11 Jul 2019 16:45:00 +0000<\/strong><\/p>\n<p><a href=\"https:\/\/blog.se.com\/datacenter\/2018\/12\/13\/liquid-cooling-servers-data-center-design\/\" target=\"_blank\" rel=\"noopener noreferrer\">Liquid cooling<\/a> is not a new technology. It\u2019s been around for decades and has historically focused on mainframes, high performance computing (HPC), and gaming applications. But today\u2019s demands for IoT, artificial intelligence, machine learning, big data analytics, and edge applications is once again bringing it into the limelight for <a href=\"https:\/\/www.schneider-electric.com\/en\/work\/solutions\/for-business\/data-centers-and-networks\/reference-designs\/\" target=\"_blank\" rel=\"noopener noreferrer\">data center design<\/a>. We\u2019re hearing more and more in the media about liquid cooling for data centers, primarily because servers are demanding high-power GPUs and CPUs to meet their business\u2019 processing needs. These chips are reaching thermal design power (TDP) of 400W now. When a rack is heavily populated with these kinds of servers, the rack density can exceed levels that are cost effective and practical for air-cooling. 
<a href=\"https:\/\/www.thegreengrid.org\/\" target=\"_blank\" rel=\"noopener noreferrer\">The Green Grid<\/a> suggests a range of 15-25kW\/rack as the limit for air cooled racks \u201cwithout the use of additional cooling equipment such as rear door heat exchangers.\u201d<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone\" src=\"https:\/\/coschedule.s3.amazonaws.com\/167392\/0f325049-b4b6-406c-8377-1253f5bd1f7f\/1562775284870.png\" alt=\"graph depicting GPU vs CPU TDP Trend for data center design\" width=\"653\" height=\"291\" \/><br \/> Source: Alibaba<\/p>\n<h2>Research-backed Analysis of Liquid Cooling for a Data Center Design<\/h2>\n<p>No doubt, these rising chip and rack densities are a key driver for liquid cooling. But there are other reasons you may want to consider the technology. My colleagues Paul Lin and Tony Day recently <a href=\"https:\/\/www.schneider-electric.us\/en\/download\/document\/APC_WTOL-B9RKEA_EN\/\" target=\"_blank\" rel=\"noopener noreferrer\">released a white paper<\/a> that discussed this very topic. In this paper, they discuss five reasons to adopt liquid cooling. We\u2019ve already discussed the first, so here are the remaining four:<\/p>\n<ol>\n<li>Pressure to reduce energy consumption<\/li>\n<li>Space constraints<\/li>\n<li>Water usage restrictions<\/li>\n<li>Harsh IT environments<\/li>\n<\/ol>\n<p>&nbsp;<\/p>\n<blockquote>\n<p style=\"text-align: center;\"><a href=\"https:\/\/www.schneider-electric.us\/en\/download\/document\/APC_WTOL-B9RKEA_EN\/\" target=\"_blank\" rel=\"noopener noreferrer\">White Paper<\/a>:<br \/> Five Reasons to Adopt Liquid Cooling<\/p>\n<p>&nbsp;<\/p>\n<\/blockquote>\n<h2>Pressure to Reduce Energy Consumption<\/h2>\n<p>Energy consumption of data centers represents a growing percentage of our global energy. This has prompted regulations and corporate initiatives requiring power usage effectiveness (PUE) and overall energy consumption reductions. 
Next to the IT systems themselves, the cooling system is the biggest energy consumer in a data center. Liquid cooling has proven to be a more efficient approach than conventional air cooling, in part because it reduces IT fan energy by 4-15%. Our preliminary analysis suggests an overall energy reduction of over 10% with immersive liquid cooling, as compared to a conventional packaged chiller-cooled data center. With numbers like this, it\u2019s an architecture that shouldn\u2019t be ignored.<\/p>\n<h2>Space Constraints<\/h2>\n<p>It is important to consider the amount of physical space needed to house not only the IT equipment, but also the cooling infrastructure that supports it. As densities rise, rack count may go down, but the ratio of physical space dedicated to air cooling equipment increases, diminishing the gains of the higher-density racks. With liquid cooling, you have an opportunity to reduce the overall data center footprint for a given IT load through significant compaction. This can be a significant benefit for large data centers or colocation providers looking to expand in space-constrained regions like Singapore and Hong Kong.<\/p>\n<h2>Water Usage Restrictions<\/h2>\n<p>With conventional air cooling, high volumes of water are often used for evaporative cooling to achieve PUEs in the sub-1.2 range. A 20MW data center consumes as much water as 2,500 people. That\u2019s pretty significant! Not only does water consumption increase operational costs, but many local municipalities are putting pressure on the data center industry in geographies with water resource constraints. Liquid cooling reduces, and often eliminates, water usage from the cooling system. 
Since most liquid cooling approaches deliver warm water directly to the IT, simple dry coolers can be used in most climates to reject the heat.<\/p>\n<h2>Harsh IT Environments<\/h2>\n<p>More and more, we are seeing IT equipment deployed in non-ideal edge environments \u2013 IoT in manufacturing facilities, warehouses, distribution facilities, and industrial applications. These environments often present challenges in terms of airborne contaminants, ambient conditions, and power quality. When standard IT is deployed in these conditions, reliability can be lower than anticipated. As the IT becomes more integrated with manufacturing and other processes, downtime can have a big impact on the bottom line. Ruggedized enclosure solutions with integrated air cooling exist but, depending on the environment, they can be less efficient and more costly. Liquid cooling represents an alternative that separates the servers from the environment. With certain liquid cooling approaches, fans are removed, and airborne contaminants are completely isolated from the IT equipment.<\/p>\n<h2>Even More Benefits of Liquid Cooling<\/h2>\n<p>The paper mentions some additional benefits of liquid cooling as well. These may not drive people to switch from air cooling to liquid cooling, but they represent important advantages once you do.<\/p>\n<ul>\n<li><strong>Minimal heat added to the space<\/strong> \u2013 With immersive liquid cooling, over 95% of the heat is removed by the liquid, meaning a comfortable working environment in the IT space.<\/li>\n<li><strong>Fans are eliminated<\/strong> \u2013 Not only does this mean less energy as discussed earlier, but it also eliminates the health risks caused by fan noise and reduces the risk of IT failures caused by fan failures.<\/li>\n<li><strong>Waste heat recovery<\/strong> \u2013 The hot water used to remove the heat from the chips provides practical recovery of waste heat, which can be used for facility or district heating. 
This can have a significant impact on the opex and overall carbon footprint of the facility.<\/li>\n<li><strong>Layout flexibility<\/strong> \u2013 With air-cooled IT equipment, hot\/cold aisle arrangement with containment is best practice for airflow management. Liquid cooling provides more flexibility to arrange equipment.<\/li>\n<li><strong>Geography not as important<\/strong> \u2013 Since liquid cooling uses warm water, full economization can be achieved in most parts of the world.<\/li>\n<\/ul>\n<p>Read Paul &amp; Tony\u2019s white paper, <em><a href=\"https:\/\/www.schneider-electric.us\/en\/download\/document\/APC_WTOL-B9RKEA_EN\/\" target=\"_blank\" rel=\"noopener noreferrer\">Five Reasons to Adopt Liquid Cooling<\/a><\/em>, to get more details on what I\u2019ve highlighted here. I am convinced that liquid cooling will become more mainstream for data centers and <a href=\"https:\/\/www.apc.com\/us\/en\/solutions\/business-solutions\/edge-computing\/\" target=\"_blank\" rel=\"noopener noreferrer\">edge computing<\/a> in the future. 
Are you?<\/p>\n<p>Leave a comment below or check out other blog posts from the <a href=\"https:\/\/blog.se.com\/tag\/data-center-science-center\/\" target=\"_blank\" rel=\"noopener noreferrer\">Data Center Science Center<\/a> team.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.se.com\/datacenter\/2019\/07\/11\/not-just-about-chip-density-five-reasons-consider-liquid-cooling-data-center\/\">It\u2019s Not Just About Chip Density \u2013 Five Reasons to Consider Liquid Cooling for Your Data Center<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.se.com\">Schneider Electric Blog<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p><strong>Credit to Author: Wendy Torell| Date: Thu, 11 Jul 2019 16:45:00 +0000<\/strong><\/p>\n<p>Liquid cooling is not a new technology. It\u2019s been around for decades and has historically focused on mainframes, high performance computing (HPC), and gaming applications. 
But today\u2019s demands for IoT,&#8230;  <a href=\"https:\/\/blog.se.com\/datacenter\/2019\/07\/11\/not-just-about-chip-density-five-reasons-consider-liquid-cooling-data-center\/\" title=\"ReadIt\u2019s Not Just About Chip Density \u2013 Five Reasons to Consider Liquid Cooling for Your Data Center\">Read more &#187;<\/a><\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.se.com\/datacenter\/2019\/07\/11\/not-just-about-chip-density-five-reasons-consider-liquid-cooling-data-center\/\">It\u2019s Not Just About Chip Density \u2013 Five Reasons to Consider Liquid Cooling for Your Data Center<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.se.com\">Schneider Electric Blog<\/a>.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[12389,12388],"tags":[20099,12391,12459,12500,14591,19002,12616,12395,15444,22246,22247,6269,20453],"class_list":["post-15763","post","type-post","status-publish","format-standard","hentry","category-scadaics","category-schneider","tag-cpu","tag-data-center","tag-data-center-design","tag-data-center-planning","tag-data-center-science-center","tag-dcsc","tag-edge","tag-edge-computing","tag-gpu","tag-harsh-environments","tag-high-density","tag-internet-of-things","tag-liquid-cooling"],"_links":{"self":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/15763","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=1576
3"}],"version-history":[{"count":0,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/15763\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=15763"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=15763"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=15763"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}