Select your host site wisely – it will be really hard to move as your data mass increases.
The term Data Gravity is gaining momentum across the data center and wider tech sector. At this year’s Pacific Telecommunications Council it was frequently mentioned by leaders of the data communications industry. Coined nearly a decade ago by software engineer Dave McCrory, the term originally described the likelihood that services and applications would be attracted to data, and that the larger the mass of data, the greater the ‘gravitational’ pull. Since then it has been used narrowly to explain the movement of enterprise data to the cloud, or more generally to examine the requirements of data analytics architectures. With regard to data centers, however, I suggest there are three ways data gravity will matter.
Space to grow
The first rule of data gravity is that the more massive the data, the harder it is to move. Amazon’s ‘Snowmobile’ service physically moves up to 100 petabytes of data to the firm’s cloud servers by truck, because it would take almost three years to move that much data over a 10Gbps connection. You may not have 100 petabytes in your data center now, but with the predicted impact of AI, IoT and 5G you may do soon. Colocation providers can add more racks and increase server density, giving businesses a more flexible growth strategy, but even they will need to add new data halls as demand grows.
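The arithmetic behind that claim is easy to verify. The sketch below assumes a perfectly utilised link with no protocol overhead, so real transfers would take even longer (closer to the "almost three years" figure once overheads and interruptions are factored in):

```python
# Back-of-envelope transfer time for bulk data over a network link.
# Assumes 100% link utilisation and zero protocol overhead -- a
# best case; real-world transfers take longer.

def transfer_days(data_petabytes: float, link_gbps: float) -> float:
    """Days needed to move data_petabytes over a link_gbps link."""
    bits = data_petabytes * 1e15 * 8        # petabytes -> bits
    seconds = bits / (link_gbps * 1e9)      # bits / (bits per second)
    return seconds / 86_400                 # seconds -> days

days = transfer_days(100, 10)               # 100 PB over 10 Gbps
print(f"{days:.0f} days, about {days / 365:.1f} years")
```

Even under these ideal assumptions, the result is over 900 days, which is why physically trucking the drives wins at this scale.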
As data storage increases, even allowing for the greater efficiencies of modern data centers and the reduced cooling requirements in the Nordics, the demand on energy grids will grow. Many locations already have severely limited power availability, and with pressure for more sustainable generation this will become a bigger problem.
It is important to select locations with the space and flexibility to grow to meet tomorrow’s data storage requirements.
Close by for current needs
McCrory’s initial definition was concerned with the latency and throughput between applications, services and data; these matter in data centers too. Many businesses now host at least some of their data and applications in the cloud. However, whether they use public or private clouds, they also retain data and applications on their own servers that need to exchange data with cloud-based applications. In addition, the high-performance compute resources needed for AI will demand high data throughput rates. The theory of data gravity suggests that both applications and compute resources will migrate closer to data to mitigate latency and throughput issues.
In practical terms this means enterprises will increasingly move the last of their on-premises facilities into colocation facilities, and will look to locate new HPC and AI resources there to take advantage of fast on-ramps to the cloud and storage servers where their data is held.
Close to customers and partners
Just as in medieval times, when the market square exerted a gravitational pull on both merchants and customers, so today the data center is becoming the foundation of a digital ecosystem. Proximity to markets, populations and potential partners is a key factor in success. Each data set will attract specialist applications and services that are permitted and able to leverage that data to create value. The greater the mass of data, the greater the number and variety of applications and services attracted. Data centers have always played the role of hubs for connectivity, power, cooling and other utilities; now they are fostering opportunities, markets and collaboration around data itself.
Location is fundamental to the success of any vibrant marketplace. It is essential to be close to customers, accessible for those that work there and convenient for those that wish to collaborate. In the emerging AI and IoT economy this increasingly means being close to the users and devices that are generating so much of the predicted tsunami of data that organisations will want to analyse and act upon. Not all applications need ultra-low latency, but for many, the shorter the distance between user and data the better.
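The link between distance and latency can be made concrete with a simple physics bound. The sketch below assumes light travels at roughly 200,000 km/s in optical fibre (about two-thirds of the speed of light in a vacuum) and ignores routing, switching and queuing delays, which add substantially in practice:

```python
# Theoretical minimum round-trip latency over optical fibre.
# Assumes a signal speed of ~200,000 km/s in fibre (about 2/3 c);
# real networks add routing and queuing delays on top of this floor.

FIBRE_KM_PER_S = 200_000

def min_rtt_ms(distance_km: float) -> float:
    """Lowest possible round-trip time in milliseconds."""
    return 2 * distance_km / FIBRE_KM_PER_S * 1000

for km in (10, 100, 1000):
    print(f"{km:>5} km: {min_rtt_ms(km):.1f} ms minimum RTT")
```

A user 1,000 km from the data has a hard floor of roughly 10 ms per round trip before any processing happens, which is why proximity matters for latency-sensitive AI and IoT workloads.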
It is also important to locate in areas that can attract the skilled workforce required to deliver the services market members require. It’s all very well putting your data north of the Arctic Circle or on a remote coastline but encouraging staff and partners to work there is another matter. Remote marketplaces frequently fail to gather the critical mass of customers and suppliers needed to survive.
Location, location, location
The laws of data gravity reiterate the old real-estate saying – it’s about location, location, location. Select your host site wisely – it will be really hard to move as your data mass increases.
Locate your cloud and enterprise data, apps and services close together to maximise throughput and minimise latency.
And finally, choose a location where your data gravity can attract the best and widest variety of customers, partners, services and suppliers to create a vibrant digital marketplace in and around the data center.
Article written by Joachim Kauppi – International Sales Director