The exponential growth in the amount of data being produced, processed, transmitted and stored today cannot be overstated. IDC estimates that worldwide data volumes will exceed 175 zettabytes by 2025, a 61 percent compound annual growth rate. We are also moving from an era in which technology largely connects people to one in which it increasingly connects machines. The most significant increase in traffic and data generation will come from machine-to-machine (M2M) communications.
With that much data being produced, it is no wonder that terms like “Data Tonnage” and “Data Gravity” are being used more frequently to describe its effect on data centers, networks and IT strategy. These forces are also reshaping the physical space that houses these technologies, with implications for Enterprise IT professionals, Cloud providers and Colocation Operators.
Machine-to-machine connections
In the near future, more than half of the devices connected to IP networks will not be personal devices (smartphones, tablets, computers or TVs), but sensors, tracking modules, cameras and other forms of M2M connections. Traffic from M2M applications, such as smart cars and industrial automation, which require high bandwidth and low latency, is now growing faster than the number of connections. Furthermore, M2M requirements will necessitate not only edge computing but edge analytics to refine and reprogram sensors and applications.
As more and more data and IT workloads move to the Cloud, the industry is discovering that data migration (the “Data Tonnage” problem) is not only time consuming but expensive. This unrelenting growth is putting increased pressure on already strained networks and data centers.
Data gravity (a term coined years ago by Dave McCrory) parallels the physical law of gravity: objects with more mass attract those with less. In the context of data and IT, large bodies of data have more mass than applications and services, so the greater mass attracts the smaller ones. However, since it is becoming increasingly impractical to keep everything close to the center, data and processing power are also moving to the edge. Edge and Cloud are not discrete entities; rather, they form a continuum. The edge can act somewhat like a traffic cop, intercepting certain requests for data and processing and thereby preventing massive flows of data to and from the Cloud.
Conversely, workloads such as Artificial Intelligence and other forms of big data analytics need to run in large data centers with the requisite storage and computational power. Of course, many applications fall somewhere in between and call for a hybrid of these two poles.
Moving closer to your customers
Cloud providers, too, are moving closer to their customers. As technology capability and bandwidth increase, so do user expectations. Hyperscalers like Google, Facebook, Microsoft and Amazon have responded by becoming some of the largest investors in dark fiber and subsea cable routes, creating redundant connectivity and better managing the immense flow of traffic between their data center campuses. Some providers also see this as a revenue opportunity; Facebook, for example, set up a subsidiary called Middle Mile Infrastructure to sell excess capacity to local and regional providers.
Impact on data center space
All of this affects decisions about the physical structures that house this technology. For enterprise-sized businesses, where to locate and continue operating traditional data centers remains a key question. While many IT workloads have moved to the Cloud, many feel we are still in the early days; moreover, moving workloads to the Cloud has not always yielded the expected outcomes and has led to some reverse migration (see our article on the same). How these forces net out depends on factors particular to each firm.
Edge computing also brings another dilemma for enterprises. There are certainly compelling reasons to move computing power and data storage closer to the edge, and some companies see opportunities to use their “bricks and sticks” to strategic advantage. Verizon is using its retail network to deploy edge computing for its 5G network. Walmart plans to use its retail footprint for its own edge strategy and also to resell capacity to businesses interested in that capability.
Cloud service providers have a seemingly insatiable appetite for building data centers, both to handle growing demand for computing power and data storage and to locate closer to their customers to improve latency. There is a concern in the current COVID-19 environment that some construction projects will be deferred (as discussed in our blog).
The need for physical data center buildings is not going away; however, the nature and structure of the space, and who owns and operates it, are changing. We expect this trend to continue as cloud computing, edge computing and the need for proprietary space evolve.
For more information on Cushman & Wakefield’s Global Data Center Advisory Group, contact us today.