
What makes a data center AI-ready?

Chelsea Robinson, Product Marketing Manager
July 11, 2019

The artificial intelligence (AI) revolution is upon us. Organizations across all industries are adopting AI to meet business challenges that include increasing efficiency, performing in-depth predictive analysis, and improving customer experience. As companies prepare their AI strategies, the choice of IT and data center partners—including their colocation (colo) solution—becomes critical.

However, not every colocation facility is prepared to support AI. AI places different demands on data centers than traditional workloads do. As new AI applications call for higher levels of computational power, electricity usage and heat generation also increase, and high-density workloads demand specialized power and cooling infrastructure.

Advancements in technology raise the standard of what constitutes “high-density,” so we can expect this threshold to rise over time. In 1961, the German-born physicist Rolf Landauer developed a theoretical proof, now known as Landauer’s Principle, demonstrating that there is an upper limit on how many computations can be performed per kilowatt-hour. At a fundamental level, computers must operate within the laws of physics: additional computing power drives higher energy use and produces more heat. What makes a colo environment “AI-ready” is its ability to support high-density workloads—and the advanced cooling capacity to keep them stable and running. When assessed against these criteria, most colo facilities are not up to the task.

The Right Support for AI

Many of the capabilities that make colo facilities AI-ready, such as power density and cooling, were traditionally restricted by physical limitations; however, user experience and guaranteed uptime are equally essential to a successful data center solution. As you consider a colocation service for your AI applications, evaluate your prospects against the following criteria to help ensure a successful deployment.

Processing Power

Artificial intelligence requires a massive amount of processing power. Without innovative approaches to processing, businesses would risk long computation times that translate to poor user experience and lost opportunity. The advent of GPUs (graphics processing units) offers breakthrough performance, training complex AI models in a fraction of the time taken by other platforms. Modern AI-ready facilities use GPUs to accelerate applications today and to plan for exponential growth in processing demand in the future.

For example, NVIDIA’s DGX-1 servers with GPUs can learn 140 times faster than CPU-only servers. Deep learning training that would take a CPU-only server 711 hours can be completed in just over 5 hours with the DGX-1. This equates to performance over 1 petaflop, while NVIDIA’s DGX-2 product provides over 2 petaflops. These performance levels translate directly into increased opportunity for IT teams and better contribution towards business objectives.
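The speedup figures above can be sanity-checked with simple arithmetic (the 711-hour and 140x numbers come from the text; the script itself is just illustrative):

```python
# Back-of-the-envelope check of the GPU training speedup cited above.
cpu_only_hours = 711   # deep learning training time on a CPU-only server
gpu_speedup = 140      # DGX-1 is quoted as learning 140x faster

gpu_hours = cpu_only_hours / gpu_speedup
print(f"DGX-1 training time: {gpu_hours:.2f} hours")  # just over 5 hours
```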

Power Management

Energy efficiency in data centers takes on a new importance as AI proliferates. Machine learning applications require a lot of training data and sophisticated algorithms in order to get satisfactory results—and as density increases, power needs increase dramatically.

AI workloads need considerably more energy than the 7 kW per rack that is considered an average target for many data centers. It’s not unheard of for an AI application to use more than 30 kW per rack, so per-rack power demands can easily exceed what standard data centers can deliver. Add to that the need for redundant power to minimize downtime, and it’s clear that AI applications need continuous, reliable energy—a lot of it—which can quickly drive up expenses. It’s vital to partner with a colo provider that has the specific expertise to control costs and efficiently manage power use.
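To see why per-rack power drives up expenses, here is a rough annual-energy sketch; the flat $0.10/kWh rate and constant full draw are hypothetical assumptions for illustration, not figures from the text:

```python
HOURS_PER_YEAR = 8760

def annual_energy_cost(rack_kw, usd_per_kwh=0.10):
    """Annual energy cost for one rack, assuming a constant draw
    and a flat (hypothetical) electricity rate."""
    return rack_kw * HOURS_PER_YEAR * usd_per_kwh

standard_rack = annual_energy_cost(7)   # ~7 kW average target rack
ai_rack = annual_energy_cost(30)        # 30+ kW AI rack
print(f"standard: ${standard_rack:,.0f}/yr, AI: ${ai_rack:,.0f}/yr")
```

Under these assumptions a single AI rack costs more than four times as much to power as an average rack, before accounting for redundancy or cooling overhead.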

Cooling Requirements

As per-rack power demands rise, so does the need for highly effective cooling. According to Gartner, more than 30 percent of data centers will no longer be economical to operate by 2020 because of AI-driven increases in workload densities. If your colocation environment can’t deliver the cooling capacity that AI applications demand, your infrastructure most likely won’t be prepared to support your future computing needs.

Growing workloads require more resources to sustain the cooler temperatures necessary to keep servers running. Fan cooling, which becomes more difficult at 16 kW and above, is insufficient for many high-density applications; higher power consumption indicates a need for alternative cooling methods to prevent equipment failure and ensure efficient operations.

Data Center Frontier reports that the growth of liquid cooling is a gradual (but notable) trend in the data center industry. Liquid cooling typically uses water, though implementations vary: some solutions bring liquid directly to the chip, while others use water to cool the air via a heat exchanger. Whatever the method, liquid cooling has significant advantages over fan cooling—in some instances reducing power usage by 20 percent (from 1.5-2.0 PUE to under 1.1).
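PUE (power usage effectiveness) is the ratio of total facility power to IT power, so the effect of a lower PUE can be sketched directly. The 100 kW IT load below is an arbitrary illustration, and the exact savings depend on the baseline PUE you start from:

```python
def facility_power(it_kw, pue):
    """Total facility power = IT load x PUE (cooling, distribution, etc.)."""
    return it_kw * pue

it_load = 100  # kW of IT equipment, illustrative
air_cooled = facility_power(it_load, 1.5)     # fan-cooled baseline PUE
liquid_cooled = facility_power(it_load, 1.1)  # liquid-cooled PUE

savings = 1 - liquid_cooled / air_cooled
print(f"facility power drops from {air_cooled:.0f} kW "
      f"to {liquid_cooled:.0f} kW ({savings:.0%} less)")
```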

Overall water use in data centers varies, and adding liquid cooling systems increases water consumption. With an air-cooled system, water use may be 8 to 16 times lower than it would be in a data center that uses liquid cooling. Colo providers can take practical steps to address the environmental impact by reducing their use of potable water and instead relying on reclaimed water for cooling, making them both AI-ready and green. Efficient and sustainable cooling protects your colocation investment while protecting the environment.

User Experience (UX)

In addition to the technical requirements of AI-ready data centers, user experience (UX) is at the core of smoothly running AI applications. Unexpected downtime and a lack of on-the-ground support can have a significant negative impact on UX.

Gartner calculated in 2014 that businesses could lose well over $300K on average in just an hour of downtime—a figure that has only increased over the last five years. A highly reliable colocation partner will deliver at least five nines (99.999%) of uptime—less than six minutes per year of downtime.
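The “five nines” figure converts to annual downtime minutes with straightforward arithmetic:

```python
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960

def max_downtime_minutes(uptime_pct):
    """Maximum annual downtime implied by an uptime percentage."""
    return (1 - uptime_pct / 100) * MINUTES_PER_YEAR

print(f"99.999% uptime allows {max_downtime_minutes(99.999):.2f} min/year")
```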

Unmatched Equipment, Resiliency and Support

We live in demanding times. Customers expect the consistent experience, high performance, and outstanding service that only AI-based technology can deliver, and AI in turn requires state-of-the-art facilities to support your business goals. It’s imperative that you select the right colocation partner to help you drive your business to the next level.

Digital Realty is a leader powering the digital ambitions of organizations around the world. Earlier this year, Digital Realty was announced as an NVIDIA DGX-Ready Data Center, demonstrating a top-notch AI-readiness that can help organizations realize their goals.

Are you ready to step into the future? Find out more about Digital Realty’s colocation services.



Digital Realty’s cloud-certified solution architects can build your scalable growth strategy and support your business transformation.