
What to Look for in an AI-ready Colocation Provider


Chelsea Robinson, Product Marketing Manager, Digital Realty, highlights what makes a colocation provider AI-ready, and how today’s enterprises can spot progressive colo companies. 

Artificial intelligence (AI) has become ubiquitous. What started as a dystopian, robots-take-over-the-world notion of the future is now a universally accepted strategy for running businesses. Nearly every industry is adopting some form of AI to stay ahead of the competition, simplify its approach to business challenges, and improve customer experience.


But how exactly do companies integrate AI into their operations? It comes down to their IT environment, and not every colocation facility is properly equipped to support AI workloads.

This raises the question of what truly qualifies a data center to support artificial intelligence: the ability to handle high-density workloads and their power requirements, and the advanced cooling technology to keep those workloads stable and running.

High-Density Workloads

Landauer’s principle, formulated in 1961, establishes a theoretical minimum energy cost for erasing a single bit of information, and therefore an upper limit on how many computations can be processed per kilowatt-hour. At a fundamental level, computers must operate within the laws of physics: more computing power means higher energy use and more heat. Where does this leave organizations that want to maximize or optimize their computing power?
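That limit is straightforward to quantify. Below is a minimal Python sketch of the arithmetic, assuming room temperature (300 K); the constants are standard physics rather than figures from this article.

```python
import math

# Landauer's principle: erasing one bit of information dissipates at least
# k*T*ln(2) joules, where k is Boltzmann's constant and T is temperature.
k = 1.380649e-23                    # Boltzmann constant, J/K
T = 300.0                           # assumed room temperature, K
e_per_bit = k * T * math.log(2)     # ~2.87e-21 J per irreversible bit operation

kwh_in_joules = 3.6e6               # 1 kWh = 3.6 million joules
ops_per_kwh = kwh_in_joules / e_per_bit

print(f"Minimum energy per bit erasure: {e_per_bit:.3e} J")
print(f"Upper limit: {ops_per_kwh:.3e} bit operations per kWh")
```

Real hardware operates many orders of magnitude above this floor, which is why energy efficiency remains an engineering problem rather than a physics problem.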

Power Innovation

Seven kW per rack is considered an average target for many data centers and colocation providers, but AI workloads demand much more. GPUs (graphics processing units) can execute the complex mathematical operations behind deep learning far more quickly and efficiently than a regular CPU (central processing unit). One example is NVIDIA’s DGX-1 server, which NVIDIA reports can process deep learning workloads roughly 140 times faster than CPU-only servers: a training job that would take a CPU-only server over 711 hours completes in about 5 hours on the DGX-1. This gives businesses the opportunity to improve their data processing and business performance in a fraction of the time. Consider what an organization could do with 1-2 petaflops of processing power, and how much faster it could accomplish its business objectives.
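Those vendor figures are at least self-consistent, as a quick sanity check shows:

```python
# Sanity check on the DGX-1 comparison cited above
# (vendor-quoted times; illustrative, not an independent benchmark).
cpu_hours = 711        # quoted deep learning training time, CPU-only server
dgx_hours = 5          # quoted training time on the DGX-1
speedup = cpu_hours / dgx_hours
print(f"Effective speedup: ~{speedup:.0f}x")   # ~142x, in line with the ~140x claim
```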

Power Up

With the rise of artificial intelligence and machine learning, energy efficiency takes on new importance for data centers. The volume of data that machine learning applications must process to support high-density workloads (think sophisticated algorithms, predictive modelling, and more) increases power needs dramatically.

With energy-demanding artificial intelligence applications known to draw more than 30 kW per rack, power demands regularly exceed what standard data center designs can deliver. Data centers and colocation providers also need redundant power strategies to minimize downtime.
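To see why density matters, consider a simple capacity-planning sketch; the 1 MW power budget below is a hypothetical figure chosen purely for illustration:

```python
# How many racks fit in a fixed power budget at different densities?
power_budget_kw = 1000.0   # hypothetical 1 MW deployment

standard_rack_kw = 7.0     # the average density cited above
ai_rack_kw = 30.0          # the high-density AI figure cited above

print(f"Standard racks supported: {power_budget_kw // standard_rack_kw:.0f}")  # 142
print(f"AI racks supported:       {power_budget_kw // ai_rack_kw:.0f}")        # 33
```

The same facility supports roughly a quarter as many racks at AI densities, which is why power provisioning, not floor space, is usually the binding constraint.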

Higher-density workloads draw more power, and more power translates to more heat. Not all data centers are built to support these increased power and cooling requirements, since higher power consumption can demand cooling methods beyond conventional fan cooling. Gartner predicts that more than 30% of data centers that fail to prepare for AI will no longer be economical to operate by 2020. If proper cooling capabilities are not in place, your IT infrastructure will fail to operate, and that failure will directly hurt your business.


One popular method of cooling, especially for data centers with AI workloads, is liquid cooling. This method uses water to carry heat out of the environment. Some solutions use direct-to-chip liquid cooling, while others use water to cool the air with a heat exchanger. Whatever the method, liquid cooling has significant advantages over fan cooling; in some instances it reduces power usage by 20%, bringing PUE down from 1.5-2.0 to under 1.1.
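PUE (power usage effectiveness) is total facility power divided by IT equipment power, so the impact of those numbers is easy to sketch; the 500 kW IT load below is a hypothetical figure for illustration:

```python
# Facility power draw at the PUE values cited above, for a fixed IT load.
it_load_kw = 500.0   # hypothetical IT load

for pue in (2.0, 1.5, 1.1):
    total_kw = it_load_kw * pue
    overhead_kw = total_kw - it_load_kw   # cooling, power distribution, etc.
    print(f"PUE {pue}: total {total_kw:.0f} kW, overhead {overhead_kw:.0f} kW")
```

Dropping from a PUE of 1.5 to 1.1 cuts non-IT overhead from 250 kW to 50 kW in this example, all of it cooling and power-delivery losses that the customer ultimately pays for.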

While liquid cooling is effective, it does increase water consumption. Another method is an air-cooled system; data centers that use this approach may consume less water than their liquid-cooled counterparts. Either way, it’s critical that colo providers address the environmental impact by relying on reclaimed rather than potable water for cooling, making them both AI-ready and green. This lets the colo customer protect both their investment and the environment.

Easy User Interface

At the core of running seamless artificial intelligence applications is the user experience (UX). 

If your colo partner isn’t guaranteeing at least five nines (99.999%) of uptime, that is, less than six minutes of downtime per year, then you might not have a highly reliable partner. Reliability is critical for any organization. In fact, Gartner calculated in 2014 that businesses lose well over $300K on average for every hour of downtime, a figure that has only increased over the last five years.
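Both of those figures are easy to verify with the availability and cost numbers above:

```python
# Downtime allowed at "five nines," and a rough annual cost using the
# ~$300K/hour Gartner figure cited above.
availability = 0.99999                  # five nines
minutes_per_year = 365.25 * 24 * 60
downtime_min = (1 - availability) * minutes_per_year
cost_per_hour = 300_000                 # USD/hour, per the 2014 Gartner estimate

print(f"Downtime budget: {downtime_min:.2f} minutes/year")   # ~5.26 minutes
print(f"Cost if fully spent: ${downtime_min / 60 * cost_per_hour:,.0f}")
```

Even a provider that just meets five nines can expose you to tens of thousands of dollars in downtime losses per year; anything weaker than that scales the exposure quickly.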

Exceptional User Interface

Customers should expect excellent service and uninterrupted, secure data transfer from their colocation provider, especially for AI applications. Choosing the right data center provider is a strategic decision, and one that is critical to achieving your business objectives.

Chelsea Robinson is the Product Marketing Manager at Digital Realty. 

