Why the Edge Is Essential to Customer-Winning User Experience

November 20th, 2020 · 2 Comments

This Industry Viewpoint was authored by Phillip Marangella, Chief Marketing Officer, EdgeConneX®

The importance of efficiently distributed network capacity and diversity of network access solutions will only increase as network traffic explodes in the coming years. IDC predicts the average person will have nearly 5,000 digital interactions per day by 2025, up from the 700 to 800 or so that people average today. As networks become strained, localized edge data centers in close proximity to end users can mitigate network bottlenecks, reduce latency and improve performance. These localized facilities also facilitate peering at the edge and act as local gateways for high-speed connections not only to the core, but also to other edges. In this way, the edge and the core can interoperate, allowing businesses to optimize traffic flows and choose for themselves where computing should occur according to their latency, cost and performance requirements.
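
To make that trade-off concrete, here is a minimal sketch, assuming a simple set of candidate sites, of how an application owner might decide where a workload runs based on latency, cost and power-density requirements. The site names, latency figures and hourly rates are illustrative assumptions, not data from any real deployment.

```python
# Illustrative sketch only: choosing where a workload runs (edge vs. core)
# based on latency, cost and power-density requirements. Every site name
# and number below is a hypothetical assumption for demonstration.
from dataclasses import dataclass


@dataclass
class Site:
    name: str
    rtt_ms: float          # measured round-trip time to the target user base
    cost_per_hour: float   # assumed compute cost at this site, USD
    kw_per_rack: int       # available power density


def pick_site(sites, max_rtt_ms, min_kw_per_rack):
    """Return the cheapest site that satisfies the latency and power floors."""
    eligible = [s for s in sites
                if s.rtt_ms <= max_rtt_ms and s.kw_per_rack >= min_kw_per_rack]
    return min(eligible, key=lambda s: s.cost_per_hour) if eligible else None


candidates = [
    Site("core-cloud-region", rtt_ms=42.0, cost_per_hour=0.90, kw_per_rack=10),
    Site("metro-edge-dc",     rtt_ms=6.0,  cost_per_hour=1.20, kw_per_rack=30),
]
choice = pick_site(candidates, max_rtt_ms=20.0, min_kw_per_rack=20)
print(choice.name if choice else "no site meets the requirements")
```

A real placement decision would weigh many more factors, but the shape of the logic is the same: latency and performance set hard constraints, and cost breaks the tie among the sites that satisfy them.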

Why is this critical? Because as network traffic reaches unprecedented levels, the quality of user experience increasingly depends on a rearchitected Internet, and for many innovative, next-gen applications, user experience is everything.

Low latency is key to a great user experience, but achieving it requires capacity in geographic proximity to end users. A data center located physically closer to its customers’ end users, combined with robust and reliable network connectivity, delivers that low latency. In turn, high-density power capabilities in the data center ensure companies are able to run the kinds of compute-intensive technologies that many modern applications demand.
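
As a rough sanity check on why proximity matters: light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, or about 200 km per millisecond, so every kilometer of path adds on the order of 5 microseconds each way before any routing, queuing or processing delay is counted. The short sketch below turns that rule of thumb into numbers; the distances are arbitrary examples.

```python
# Back-of-the-envelope fiber propagation delay. The ~200 km/ms figure
# (about two-thirds of c) is a common rule of thumb; real paths add routing,
# queuing and serialization delay on top, so treat these as lower bounds.
FIBER_KM_PER_MS = 200.0  # approximate speed of light in fiber, km per millisecond


def one_way_delay_ms(distance_km: float) -> float:
    return distance_km / FIBER_KM_PER_MS


for miles in (50, 200, 1000, 2500):
    km = miles * 1.609
    rtt = 2 * one_way_delay_ms(km)
    print(f"{miles:>5} mi ≈ {km:6.0f} km: round-trip ≈ {rtt:5.1f} ms (propagation only)")
```

Fiber rarely follows a straight line and every hop adds its own delay, so real-world figures are typically several times higher, which is exactly why the physical placement of capacity matters.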

Scalable data center infrastructure ensures companies have the compute, storage, power and network capacity needed to deliver business and consumer applications, content and online services for users, no matter how fast demand may spike. Moreover, location matters. If an organization is present only in the top data center markets, it’s not close enough to all of its customers to meet the speed and performance expectations many end users now take for granted. Make no mistake: in a world where “slow is the new down,” that’s a tremendous competitive disadvantage. No one signing on to Zoom for a business conference or into Disney+ to watch a movie with their children wants to experience the packet loss or network jitter that degrades end user QoS (quality of service).

Digital Transformation in Months, Not Years

The most successful digital innovators succeed by rethinking how to deliver the best experience for their users. It follows that as users have great experiences enabled by technology in one aspect of their lives, they come to expect similarly great experiences in all areas, and the companies that deliver those great experiences win.

Key to winning is the ability to have capacity wherever your end users are – whether that means partnering with an existing provider or building new infrastructure. By localizing data processing, storage and application workloads closer to the end user, edge computing facilities mitigate latency, circumventing the distance, capacity constraints, multiple network hops and centralized processing loads inherent in traditional Internet architecture.

Another powerful rationale for edge computing is minimizing the network traffic going to and from the cloud, and this is likely to deliver at least as much economic value as latency mitigation. Backhauling data from edge devices to a centralized cloud data center is already costly and creating bottlenecks, a state of affairs that will only worsen as data volumes increase; offloading that processing to the edge, often called cloud offload, avoids much of this traffic and expense.
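
To put rough numbers on that economic argument, the sketch below compares backhauling every byte to a central cloud against filtering and aggregating at the edge first. The device count, data volumes, per-GB transfer rate and edge reduction ratio are all assumptions chosen for illustration; actual backhaul and cloud egress pricing varies widely.

```python
# Illustrative backhaul-cost comparison. Device counts, data volumes, the
# per-GB transfer rate and the edge reduction ratio are hypothetical
# assumptions, not real pricing.
DEVICES = 10_000
GB_PER_DEVICE_PER_DAY = 2.0
TRANSFER_COST_PER_GB = 0.05  # assumed blended backhaul/egress rate, USD
EDGE_REDUCTION = 0.90        # assume edge aggregation drops 90% of raw data

raw_gb_per_month = DEVICES * GB_PER_DEVICE_PER_DAY * 30
cloud_only = raw_gb_per_month * TRANSFER_COST_PER_GB
with_edge = raw_gb_per_month * (1 - EDGE_REDUCTION) * TRANSFER_COST_PER_GB

print(f"Raw data per month:    {raw_gb_per_month:,.0f} GB")
print(f"Backhaul everything:   ${cloud_only:,.0f}/month")
print(f"Aggregate at the edge: ${with_edge:,.0f}/month")
```

Under these assumptions, the edge-first approach cuts the transfer bill by an order of magnitude, and the same reduction applies to the load those flows place on the network.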

The regional architecture of cloud service providers also means slower performance away from local nodes. Cloud gamers, for example, are notoriously demanding when it comes to latency. Leveraging the cloud for burst capacity can be a smart move for managing the launch-day demand of new gaming titles. However, when high demand persists, cloud is not always the most economical solution, particularly when latency requirements demand closer proximity to end users. In addition to the security gained by bypassing the public Internet, cloud on-ramps deployed in edge data centers deliver low-latency performance improvements; shortening the network path by even 200 miles can make a noticeable difference in the quality of end user experience.

These and other use cases — including autonomous vehicles, AI, augmented and virtual reality (AR/VR), and IoT and smart city applications — also call for an edge data center partner that has existing capacity and a proven track record of quickly bringing net new capacity online when and where it’s needed. While no one can predict the future, it’s reasonable to anticipate that as the global workforce becomes increasingly mobile, companies will continue to rely on video conferencing, cloud collaboration platforms and virtual private networks (VPNs), even as consumers expand their usage of e-commerce platforms and telehealth services. As Microsoft CEO Satya Nadella recently commented, “As Covid-19 impacts every aspect of our work and life, we have seen two years’ worth of digital transformation in two months.”  

These trends and technologies require reliable connectivity and capacity to deliver the quality of user experience necessary to make them viable, which in turn depends on edge data centers that can deliver low latency and favorable economics wherever these services are needed.

 




2 Comments So Far


  • whylonglvlt says:

    Lumen Technologies has the network to deliver ultra-low latency of just 5 milliseconds one way to 98% of Americans! Lumen has the platform to do amazing things in the 4th industrial revolution, occurring right before our very eyes.

    • Kidwell1 says:

      But it takes 180+ days to get that network delivered and activated. And if something goes wrong, another 30-60 days. So when you average the time to deliver and repair, the latency goes way up.
