
Breaking Down the Barriers to Data Gravity

By Tony Bishop, SVP, Growth, Platform and Marketing, Digital Realty
August 17, 2020

Our society is becoming data-intensive in almost everything we do – shopping online, interacting with customer service agents, or joining remote meetings via videoconferencing platforms. We’re creating a massive amount of data that must be processed, analyzed, and applied to keep our applications and businesses running smoothly. Driven by automation, mobile, and the internet of things (IoT), data is growing exponentially. In fact, it’s estimated that 463 exabytes of data will be created each day globally by 2025.
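To put that projection in perspective, a quick back-of-envelope conversion (a sketch, assuming the decimal definition of an exabyte) shows what 463 exabytes per day means as a sustained data rate:

```python
# Back-of-envelope: convert the projected 463 exabytes/day into a per-second rate.
EXABYTE = 10**18          # bytes, decimal (SI) definition
SECONDS_PER_DAY = 86_400

daily_bytes = 463 * EXABYTE
per_second = daily_bytes / SECONDS_PER_DAY

print(f"{per_second / 10**15:.2f} PB/s")  # ≈ 5.36 PB/s
```

That works out to more than five petabytes every second, around the clock, which is why where that data lands matters so much.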

This drive towards a digital economy has already started to reshape how enterprises create and deliver value for their customers. As data accrues and processing demands increase, today's IT infrastructure faces significant challenges. Accumulated data tends to attract additional services and applications, creating data gravity: an effect analogous to the gravitational pull between massive objects like the earth and the moon. As data accumulates and more services and applications come to depend on it, the data exerts a compounding pull on the business, adding complexity and stalling digital transformation.

Data gravity is one of the biggest challenges facing companies today. It complicates business processes and deprives companies of the flexibility and agility needed to transform digitally. Businesses need to harness the effects of data gravity instead of resisting them. To do this, they need a modernized infrastructure that can support the influx of data from many users, locations, clouds, and networks. Taking advantage of data gravity requires a network footprint that creates centers of data exchange, allowing traffic to be aggregated and maintained via public or private clouds, at the core or the edge, and from every point of business presence. A modernized infrastructure lowers data gravity barriers by bringing applications, compute, users, and things to the data.

Creating AI Centers of Excellence with NVIDIA DGX Systems to Accelerate Global Innovation

Technologies like artificial intelligence (AI) are exacerbating the negative impact that data gravity can have on a business's IT infrastructure. Training AI models is a repetitive, iterative process. If the AI compute infrastructure is not located near a business's data sets, training becomes even more difficult and time-consuming. And if the main data set resides solely in the cloud, training can quickly become cost-prohibitive given the expense of moving models and data back and forth. As a result, the ability to do model training and inferencing in close proximity to where data sets reside, such as in a multi-tenant data center, is key.

While many data scientists are brilliant mathematicians who quickly develop algorithms for AI models, not all are trained in distributed IT and software architectures. For this reason, one developing industry trend we are seeing is "model debt": data science teams quickly develop AI models, but the models remain undeployed for months while IT infrastructure and MLOps teams build the processes needed to deploy and maintain them successfully.

The increasing demands of AI-powered applications and the growing popularity of capabilities like virtual and augmented reality require multi-tenant data centers built to support next-gen technologies and applications. To that end, we've teamed up with NVIDIA to make AI infrastructure available globally. In May, we launched a new enterprise IT solution called Data Hub. It features NVIDIA DGX POD to help enterprises solve their global coverage, capacity, and connectivity needs by bringing powerful computing resources closer to customer data. In doing this, we're embracing data gravity principles. By offering an interconnected platform for enterprises powered by NVIDIA DGX systems, we're providing access to multi-tenant data centers that can support the cooling, power, and workload demands that AI requires.

We’re excited about the next milestone in our partnership to establish AI centers of excellence globally. Together with Core Scientific, who brings unique expertise in orchestrating the processes behind successful AI projects, we’re deploying the industry’s first Data Hub powered by the NVIDIA DGX A100 at our Interxion Digital Docklands Campus, located in the heart of London’s financial center. As the first data center provider to earn an NVIDIA DGX-Ready Data Center certification, Digital Realty’s data centers are set up to help enterprises tackle their most pertinent AI infrastructure challenges. With Core Scientific and NVIDIA DGX systems, data science teams will be empowered with a simplified and streamlined AI development workflow that speeds insights from data and supports overall enterprise transformation.

Building the Foundation for Our Digital Future

The new demands brought on by AI and machine learning create opportunities to re-architect IT so that businesses can operate ubiquitously and on demand, informed by real-time intelligence that powers innovation and scales digital business. This re-architected infrastructure must also support the data exchange that fuels application development.

Our collaboration with NVIDIA is built on PlatformDIGITAL™, the foundational layer we use to build the infrastructure of the future and address the complex challenges facing today's enterprises.

PlatformDIGITAL™ enables businesses to re-architect towards a decentralized infrastructure, which brings users, networks, clouds, systems, and things together with the data to break through the barriers of data gravity. This solution model helps customers solve their global coverage, capacity, and ecosystem connectivity needs while adapting to the changing business landscape and fueling innovation for the future of business.

JOIN our live webinar, Accelerating Business Innovation and ROI with AI Platform-as-a-Service, on Wednesday, August 19th at 7:00 am PDT | 10:00 am EDT | 3:00 pm BST - Sign up HERE.

Explore how IT leaders can enable AI innovation with a new AI Platform-as-a-Service that combines the simplicity and IT controls of the cloud with AI workflow tools and resources that accelerate insights from business data with:

Tony Paikeday – Senior Director of Product Marketing, Artificial Intelligence and Deep Learning, NVIDIA
Ian Ferreira - Chief Product Officer, Artificial Intelligence, Core Scientific
Wes Jensen - Director, Global Technology Business Development, Digital Realty



Digital Realty's cloud-certified solution architects can help you build a scalable growth strategy and support your business transformation.