This Industry Viewpoint was contributed by Scott Askins, VP of Channels and Strategic Alliances at Netrality Properties
It’s no secret that cloud adoption has risen astronomically over the past few years. Thanks to the even more recent explosion of public cloud services, hybrid cloud has replaced private cloud as the dominant force in enterprise computing. According to McAfee, more than half of the enterprises it surveyed in 2017 had adopted hybrid cloud, and it forecast that 80% of all IT budgets would be allocated to cloud services by 2019.
The amount of data now being generated by enterprises across a wide array of connected devices is staggering, and growing exponentially. The Internet of Things alone will generate 507.5 zettabytes of data by 2019 across more than 20 billion connected objects. The volume and distributed nature of this data necessitates the hybrid cloud strategy we are now seeing most enterprises adopt.
However, to keep pace with the data deluge and support a modern hybrid cloud setup, enterprises must embrace a more robust edge strategy or risk significant performance issues. Here’s why, and how interconnected data centers stand to power the edge of the future:
The cloud is not enough
Every day there are more connected devices transmitting data from the field, fueling incredible demand for real-time processing and analytics. This expansion is being enabled by unprecedented access to the hardware and software that power today’s analytics capabilities. A report by Goldman Sachs shows that over the past 10 years:
- The cost of IoT sensors has halved
- The cost of bandwidth has fallen 40-fold
- The cost of processing has fallen 60-fold
The number of sensors in use, and the volume of data they generate, have exploded, driving commensurate demand for complex, real-time processing for business analytics, artificial intelligence and other business-critical applications.
The scale is such that traditional cloud cannot support it. According to Gartner’s Thomas Bittman, “The agility of cloud computing is great – but it simply isn’t enough… As people need to interact with their digitally-assisted realities in real time, waiting on a data center miles (or many miles) away isn’t going to work.”
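Bittman’s point about distance can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: it assumes signals travel through optical fiber at roughly 200,000 km/s (about two-thirds the speed of light in a vacuum), and it ignores routing, queuing and processing delays, which only add to this physical floor.

```python
# Back-of-the-envelope propagation delay: why distance to the data center matters.
# Assumption: signal speed in fiber is ~200,000 km/s (refractive index ~1.5).
# Real-world latency is strictly worse once routing and queuing are added.

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s, expressed in km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Theoretical best-case round-trip time over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

for km in (10, 100, 1000, 4000):
    print(f"{km:>5} km away -> at least {round_trip_ms(km):.2f} ms round trip")
```

Even in this best case, a data center 1,000 km away imposes a 10 ms round-trip floor before any processing happens, while an edge facility 10 km away sits at 0.1 ms, which is why applications like AR and autonomous vehicles push compute toward the user.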
The rise of, and need for, edge computing is driven by the growth of IoT, as well as mobile devices and massive content delivery networks.
The edge on the rise
Whereas the shift to cloud computing is well underway, edge computing still has quite a bit of growth ahead of it. By 2020, the global edge computing market is expected to reach $6.72B, up from $1.47B in 2017.
The edge computing market may always remain smaller than its cloud counterpart, thanks to its narrower range of use cases where low latency is especially critical. These uses will be central to the future of technology, however, as self-driving cars, machine learning/AI, virtual and augmented reality and the aforementioned IoT, among others, rise in prominence.
This makes the architecture of the edge extremely important. There are a number of options for bringing the cloud closer to consumers, including through public cloud providers, but physical, interconnected and locally positioned data centers provide the best way forward.
Interconnected data centers power the edge on location
Across an IT landscape in which enterprises manage a varied mix of infrastructure, including on-premises, hosted, and public and private cloud solutions, it is increasingly important for IT to focus on delivering business-enabling services and outcomes rather than on physical infrastructure. Interconnected data centers located in key geographic areas offer direct access to multiple disparate carrier/mobile networks and direct cloud onramps, supplementing existing IT delivery options such as on-premises data centers, private clouds or public clouds.
Data-rich and analytics-driven enterprises need the cloud to come closer to where their consumers are located. So edge data centers need interconnection – immediate proximity to multiple bandwidth options from multiple long haul and metro carriers – to meet the real-time processing demands of edge computing. For enterprises running applications that require low latency, interconnected edge data centers/colocation facilities located in close proximity to their end users deliver the processing capabilities these enterprises need to remain competitive.
However, not all colocation facilities are the same: it is important to partner with a completely carrier-neutral, connectivity- and bandwidth-neutral colocation provider. Instead of looking for a colocation offering bundled with managed services, enterprises should look for a provider that partners with, rather than competes against, CDNs, CSPs, MSPs and cloud providers.
Seeking out these interconnected edge data centers in key geographic regions is the best way to bring the edge as close as possible to end users.