Energy Efficiency Requirements for Edge Computing

By Steven Carlini | 18 December 2019

After participating in the DCD plenary panel on how the data centre industry must respond to the global climate emergency, I spoke with Dave Johnson, Executive Vice President of the Secure Power Division at Schneider Electric, who shared further insights into energy use in distributed IT and edge data centre applications.

The International Energy Agency (IEA) cites data centre and network technologies as being on track with its Sustainable Development Scenario (SDS). This should be taken as encouraging, since the SDS offers a pathway for the global energy system to achieve three strategic goals: the Paris Agreement's well-below-2°C climate goal, universal energy access, and a substantial reduction in air pollution.[1]

The IEA’s current assessment concludes that “sustained efforts by the ICT industry to improve energy efficiency, as well as government policies to promote best practices, will be critical to keep energy demand in check over the coming decades.”

Dave told me that he believes meeting the future energy efficiency requirements of IT systems will require a greater focus on edge computing. “I actually think that’s the place we should spend more attention, so it’s time to think about greening the edge if you will, and there’s a couple of different angles to that… We want to make sure that some of the same attention that we’ve given to big data centers is given to micro data centers.”

Metrics to Measure for the Edge

PUE (Power Usage Effectiveness), the ratio of total facility energy to IT equipment energy, is an efficiency metric originated by The Green Grid and widely applied in the established data center market, but questions remain about its applicability to the edge. Dave explained that using PUE, or an equivalent developed for a micro data center, could encourage owners and operators of edge facilities to achieve the same kind of efficiencies as are already being achieved in large and hyperscale data centers. “There’s huge gains to be made here and the good news is we’ve done it before as an industry, so let’s do it again at the edge.”
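As a minimal illustration of the metric itself (not any particular vendor’s tooling), the sketch below computes PUE from two metered energy readings; the figures for a single 10 kW micro data center are hypothetical.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.

    A PUE of 1.0 would mean every kWh entering the facility reaches the IT
    load; cooling, power conversion and lighting overhead push it higher.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical monthly readings for a single 10 kW micro data center.
it_load_kwh = 10 * 24 * 30          # ~7,200 kWh of IT load
facility_kwh = it_load_kwh * 1.6    # overhead typical of an unoptimized site
print(f"PUE = {pue(facility_kwh, it_load_kwh):.2f}")   # -> PUE = 1.60
```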

However, he advises that this works best when implemented in conjunction with other technology innovations. For example, “we have lithium-ion batteries that we’re starting to use in big data centers, we should start to think about using them in micro data centers; and then innovative cooling solutions like liquid cooling, could really start to help us solve this efficiency equation at the edge.”

As we see exponential growth in edge data centers, owners and operators will need to manage their portfolios of micro data centers more effectively, as well as deliver more energy-efficient individual facilities. Dave highlighted to me the importance of implementing cloud-based management tools such as Schneider’s EcoStruxure IT Expert to manage, support and maintain distributed physical IT assets throughout the entire lifecycle. Implementing a cloud-based, vendor-agnostic solution enables real-time monitoring and visibility.
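The snippet below is a generic sketch of that portfolio view, not Schneider’s EcoStruxure IT Expert API: it aggregates hypothetical telemetry from a fleet of micro data centers into per-site and fleet-wide PUE figures.

```python
from dataclasses import dataclass

@dataclass
class SiteReading:
    site_id: str
    facility_kwh: float   # total energy drawn by the site
    it_kwh: float         # energy delivered to IT equipment

def fleet_pue(readings: list[SiteReading]) -> float:
    """Fleet-wide PUE: total facility energy over total IT energy across sites."""
    total_facility = sum(r.facility_kwh for r in readings)
    total_it = sum(r.it_kwh for r in readings)
    return total_facility / total_it

# Hypothetical telemetry pulled from a cloud monitoring service.
readings = [
    SiteReading("store-001", facility_kwh=11_500, it_kwh=7_200),
    SiteReading("store-002", facility_kwh=9_400, it_kwh=7_100),
    SiteReading("store-003", facility_kwh=13_000, it_kwh=7_300),
]
for r in readings:
    print(f"{r.site_id}: PUE {r.facility_kwh / r.it_kwh:.2f}")
print(f"Fleet PUE: {fleet_pue(readings):.2f}")
```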

Sustainability: Size Versus Power

Gartner reports that by year-end 2023, more than 50% of large enterprises will have deployed at least six edge computing use cases for IoT or immersive experiences, versus less than 1% in 2019.[2]

Technology leaders whose companies fully own their edge data center facilities will be in a better position to develop strategies for renewable energy and energy sourcing than those operating under diversified ownership.

Gartner’s analysis highlights the range of sizes and power levels that will exist within a single portfolio, and Dave explained the impact this variance will have on sustainability. “Imagine you had ten thousand micro data centers that were, let’s say, 10 kW each; that’s 100 MW, just for that one group of micro data centers. Those are the kinds of numbers that we’re used to talking about here. If we take the equivalent of the PUE as applied to large data centers, then in this example, improving the PUE of all the micro data centers from 1.6 to 1.2 would be comparable to taking 50,000 cars off the road.”
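To make the arithmetic behind that comparison concrete, the sketch below works through it under loudly stated assumptions: 10,000 sites at 10 kW of IT load each, a PUE improvement from 1.6 to 1.2, an illustrative grid emissions factor of roughly 0.45 kg CO2 per kWh, and the EPA’s figure of about 4.6 tonnes CO2 per typical passenger car per year. The exact car-equivalent depends heavily on the emissions factor chosen, but the result lands in the same order of magnitude as the quoted figure.

```python
# Rough check of the "50,000 cars" comparison, under illustrative assumptions.
SITES = 10_000
IT_KW_PER_SITE = 10                      # kW of IT load per micro data center
HOURS_PER_YEAR = 8_760
GRID_KG_CO2_PER_KWH = 0.45               # assumed grid emissions factor
CAR_TONNES_CO2_PER_YEAR = 4.6            # EPA estimate for a typical passenger car

it_load_mw = SITES * IT_KW_PER_SITE / 1_000           # 100 MW of IT load
facility_mw_before = it_load_mw * 1.6                 # total draw at PUE 1.6
facility_mw_after = it_load_mw * 1.2                  # total draw at PUE 1.2
saved_mwh_per_year = (facility_mw_before - facility_mw_after) * HOURS_PER_YEAR

saved_tonnes_co2 = saved_mwh_per_year * 1_000 * GRID_KG_CO2_PER_KWH / 1_000
cars_equivalent = saved_tonnes_co2 / CAR_TONNES_CO2_PER_YEAR

print(f"IT load: {it_load_mw:.0f} MW")
print(f"Energy saved: {saved_mwh_per_year:,.0f} MWh/year")
print(f"CO2 avoided: {saved_tonnes_co2:,.0f} tonnes/year, roughly {cars_equivalent:,.0f} cars")
```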

Dave remains optimistic about the future, acknowledging the experience the data center industry has acquired over time. “The good news is that we already know how to do this, we’ve already done it with big data centers, so let’s apply some of that thinking to edge and then we can have these kinds of benefits at the edge.”

Interested in learning more about sustainable tactics for data centers? Read our brochure EcoStruxure™ Power Monitoring Expert, or download our white paper, Essential Guidance on DCIM for Edge Computing Infrastructure.

 
