My Data Center Predictions for 2020: The Year We Set the Stage for Futuristic Autonomous and Immersive Applications

Credit to Author: Steven Carlini | Date: Wed, 08 Jan 2020 16:18:00 +0000

For the New Year, several noteworthy trends will dominate the data center market and impact the industry, setting the stage for futuristic autonomous and immersive applications. For my 2020 predictions, I am focusing on the overlap of Cloud and Telco (5G) at the Local Edge, infrastructure requirements, energy demands of the edge architecture build-out, dedicated cooling solutions for AI learning applications, and purpose-built data centers at scale for ASICs.

Cloud at the Local Edge

Cloud computing has been a bit of an enigma over the years, but now, driven mainly by the smartphone era, almost everyone understands that their pictures live in a large cloud data center somewhere. There are certain geographical “sweet spots” around the world where these very large data centers cluster. These sweet spots offer low-cost or renewable power, access to an IT workforce, tax incentives, and, lately, a climate on the cold side that is conducive to “free cooling.” Unless you are lucky enough to live in close proximity to one of these hyperscale data centers, you have most likely experienced slower-than-expected performance from cloud services like Microsoft 365, Salesforce, Box, etc. Additionally, if you have moved your business-critical operations to the cloud and you are not satisfied with the performance or availability of those applications, you have probably heard that the solution could be an on-premise (your site) “cloud stack” or “tethered cloud.”

There has been a lot of speculation and promises around extending the data center cloud on-premise to the “local edge.” Smaller versions of these hyperscale data centers could increase speed, lower costs, and allow businesses to keep data within their four walls, giving them greater control over that information. The idea is to allow businesses to run similar IT infrastructure on-premise to improve the consistency of their hybrid cloud system. The ultimate goal is to enable businesses to actually use the same tools, APIs, hardware, and functionality across both their on-premise and cloud systems to create a consistent hybrid user experience.

While Azure Stack has been available for a short time, it has not exactly been affordable, though more cost-effective versions of the Azure solution are coming. AWS Outposts was officially made available at the end of 2019, and Google Anthos is right on its heels. 2020 will be more of a year of validation and site testing, but it’s a move in the right direction.

Next Gen Telco (5G) and IT Data Center Architecture Converge

Today’s connected society and the proliferation of high-bandwidth technology – video, immersion (AR/VR), haptics, etc. – require local data architecture to deliver on the promises. Yes, even 5G must deal with physics and the speed of light, and it will be deployed in small clusters, each requiring a mobile edge computing (MEC) data center. Think of roughly four new MEC data centers in the coverage area of a single 4G tower and base station. It’s going to be a tremendous challenge first to justify the ROI, then to find locations for these MEC data centers (tops of buildings, parking garages, basements), where deployment costs will be high and access rights to these locations are uncertain. At the same time, cloud providers (Amazon, Google, Microsoft) are announcing their intention to offer on-premise local cloud stacks or tethered clouds at the local edge – a very similar deployment strategy. Additionally, the application needs driving the cloud to the edge, namely latency, overlap significantly with the promises that 5G makes for mobile users (public and private), making MEC data centers a natural and likely meeting point of cloud services and the mobile world.


Finally, 5G is a software-defined technology intended to run on standard IT servers, so it is quite possible, and likely, that we will see MEC data centers providing on-premise cloud services as well as running the 5G application.

Energy Costs at the Edge Can Be Massive

I am predicting that next-gen telco and tethered clouds will drive a massive build-out at the edge. If we use the local MEC needed for 5G as a proxy, we can estimate the impact on energy and carbon. In terms of scale, significant global 2/3/4G coverage is already in place, with about 5 million telco tower base stations in the world drawing about 6 kilowatts (kW) on average and rising to 8-10 kW at peak traffic periods. That is a global footprint of roughly 50 GW at peak power! Unfortunately, most of these tower base stations were not conceived with energy efficiency in mind. They operate at a PUE of around 1.5 (total power in divided by power delivered to the telco (IT) load), meaning that for every watt of telco load, another half-watt is lost to cooling and power conversion. When deployed at scale, this power adds up quickly, and the waste is multiplied by the number of deployments.
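To make the PUE arithmetic concrete, here is a rough back-of-the-envelope sketch in Python using the figures above (about 5 million sites, roughly 10 kW each at peak, PUE around 1.5). The numbers are illustrative estimates, not measured data.

```python
# Back-of-the-envelope sketch of the global telco base-station footprint,
# using the approximate figures cited above.

SITES = 5_000_000          # telco tower base stations worldwide (approximate)
PEAK_KW_PER_SITE = 10      # peak telco (IT) load per site, in kW
PUE = 1.5                  # total facility power / IT load power

it_load_gw = SITES * PEAK_KW_PER_SITE / 1e6   # kW -> GW
total_power_gw = it_load_gw * PUE             # facility power including overhead
overhead_gw = total_power_gw - it_load_gw     # cooling and power-conversion losses

print(f"IT load at peak:     {it_load_gw:.0f} GW")      # ~50 GW
print(f"Total facility draw: {total_power_gw:.0f} GW")  # ~75 GW at PUE 1.5
print(f"Overhead:            {overhead_gw:.0f} GW")     # ~25 GW of losses
```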

Here’s a real example of an initial 5G build-out: a Chinese operator recently added 100,000 5G-ready base station sites averaging 10 kW each – that’s 1 GW of IT load! At a PUE of 1.5, this could cost roughly €1.3B ($1.45B) in energy annually and give off 9.3 million tons of CO2 per year (based on the U.S. national average CO2 footprint). But if these systems were designed to be extremely energy efficient, PUE could be 1.1, and the build-out would cost about €1B ($1.12B) and give off 6.8 million tons of CO2 annually.
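Here is a minimal Python sketch of that comparison. The electricity price (~€0.10/kWh) and grid carbon intensity (~0.7 kg CO2/kWh, roughly the U.S. national average referenced above) are assumptions chosen to approximately reproduce the quoted figures, not published tariffs.

```python
# Minimal sketch of the 5G build-out example: 100,000 sites at ~10 kW each,
# comparing annual energy cost and CO2 at PUE 1.5 vs. an efficient design at PUE 1.1.
# EUR_PER_KWH and KG_CO2_PER_KWH are assumed values, not published figures.

SITES = 100_000
KW_PER_SITE = 10                 # average IT load per 5G site, kW
HOURS_PER_YEAR = 8_760
EUR_PER_KWH = 0.10               # assumed electricity price
KG_CO2_PER_KWH = 0.70            # assumed grid carbon intensity (~U.S. average)

def annual_impact(pue: float) -> tuple[float, float]:
    """Return (annual energy cost in billion EUR, annual CO2 in million tonnes)."""
    it_load_kw = SITES * KW_PER_SITE                  # 1,000,000 kW = 1 GW
    kwh_per_year = it_load_kw * pue * HOURS_PER_YEAR  # facility energy incl. overhead
    cost_beur = kwh_per_year * EUR_PER_KWH / 1e9
    co2_mt = kwh_per_year * KG_CO2_PER_KWH / 1e9      # kg -> million tonnes
    return cost_beur, co2_mt

for pue in (1.5, 1.1):
    cost, co2 = annual_impact(pue)
    print(f"PUE {pue}: ~{cost:.1f} B EUR/yr, ~{co2:.1f} Mt CO2/yr")
# PUE 1.5: ~1.3 B EUR/yr, ~9.2 Mt CO2/yr
# PUE 1.1: ~1.0 B EUR/yr, ~6.7 Mt CO2/yr
```

Even a modest PUE improvement compounds across every site, which is why designing for efficiency matters so much at this scale.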

Energy costs and controlling carbon are critical considerations for this generation of telecom. I predict a focus on standardization, and innovative solutions like liquid cooling and cloud-based management systems will be leveraged.

Drop-in Computing Racks for AI Learning

Artificial Intelligence (AI) awareness has gone mainstream, driven mainly by what people see in futuristic movies, where AI is chiefly depicted as robots that have learned to think on their own. We are far from that reality, but we have started down a path where we want our computers to learn things on their own using neural network models, which mimic the way the human brain works. For example, healthcare applications are using machines to reduce the time it takes to plan treatments or diagnose ailments. Once the machines have developed the algorithm and the inputs needed, much less powerful computers can be used for inference AI. The issue is that learning AI, or AI training, uses the most powerful processors (CPUs and GPUs) and can produce 20, 30, or even 50 kW of heat per rack. These heat levels are inefficient and, in most cases, impossible to cool with any form of popular mainstream cooling.

For 2020, I see a trend where data centers of all kinds – cloud, colocation, and enterprise – will need to deploy dedicated drop-in computing racks for AI learning. The best way to cool these is some form of liquid cooling – immersion or direct-to-chip. These drop-in racks will not need air containment or to be lined up in rows or pods. Again, they will be limited in number, and once they have developed the appropriate model, they will sit dormant until a new model is needed.

Data Centers at Scale for ASICs

An application-specific integrated circuit (ASIC) is an integrated circuit (IC) customized for a particular use rather than intended for general-purpose computing. AI, as well as digital-ledger applications (blockchain), runs far faster and much more efficiently on ASICs than on general-purpose chips. We will see companies that are not considered mainstream GPU and CPU developers enter the market with data-center-scale, application-specific chipsets.

2020 is an exciting time to set the stage for futuristic autonomous and immersive applications. Computing power and storage will be packaged more effectively and operate more efficiently in the form of edge computing and ASiCs. While the industry is motivated to move fast, the stakes are high as the deployment scale, energy use, and carbon footprint are top of mind issues.
