This Industry Viewpoint was authored by Stefan Bernbo, CEO and founder of Compuverde.
The Internet of Things (IoT), an ever-expanding web of connections generating unprecedented volumes of data, holds promise as well as challenges for the telecom industry. The Open Interconnect Consortium was recently formed by a group of technology vendors to define connectivity requirements that ensure the interoperability of the more than 30 billion devices that research group IDC projects will come online by 2020. It’s a blistering pace. Analyst firm Ovum predicts that M2M connections will grow at a rate of 23 percent per year over the next few years.
The IoT’s vast quantities of data will facilitate improvements across all facets of life and business, provided that data can be properly stored and analyzed. However, current storage approaches are already proving insufficient. Telecoms will need to consider new approaches to data center architecture if they hope to remain profitable while continuing to provide the levels of service their customers demand.
Appliances – the server hardware boxes that come with proprietary, mandatory software – make up the bulk of today’s data center architecture. The software is designed for the hardware (and vice versa), and the two come bundled together as a package. The benefits of this configuration include convenience and ease of use.
These appliances usually come with built-in redundant copies of costly components, designed to anticipate and prevent any single point of failure. These extra components bring higher hardware costs, greater energy usage and additional layers of complexity. When companies begin planning for infrastructure that can accommodate growth events like the IoT, costs for this traditional architecture skyrocket.
Traditional appliances present another difficulty: with hardware as the data center backbone, requests come in via a single point of entry and are then re-routed. Imagine a million users connected to that one entry point at the same time. That’s a recipe for a bottleneck, one that prevents service providers from scaling to the capacity needed to support the Internet of Things.
Reducing Costs, Increasing Flexibility
Scaling to meet IoT needs is not optional; it is essential. Telecoms must therefore consider storage alternatives to the status quo. One such option is software-defined storage. By taking features typically found in hardware and moving them to the software layer, a software-defined approach to data center architecture eliminates dependency on server “appliances” with software hard-wired into the system. This option provides the scalability and speed that the IoT demands.
Everyday devices have been “software-defined” for years now; this is not just another example of a buzz phrase that can’t live up to its hype. The PC is a prime example: software can be installed on any hardware platform, allowing the user to custom-tailor both the hardware and the software according to his or her needs. The average PC can use Linux as an operating system if the owner so chooses. This gives the user greater freedom to allocate his or her budget precisely as needed for the task at hand.
Because software-defined storage liberates the software from the hardware, administrators can choose inexpensive commodity servers, making it a cost-reducing alternative to traditional appliances. When coupled with lightweight, efficient software, commodity servers can yield substantial cost savings for online service providers seeking to accommodate their users’ growing demand for storage.
Because data centers have differing needs, scalability is an issue, and one that software-defined storage addresses well. A telco serving one particular area will have different storage needs than a major bank with branches in several countries, and a cloud services provider will have different needs still. While appliances might be good enough for many of these needs, fully uncoupling the software from the hardware can extract substantial economies of scale.
Administrators are freed from the constraints of the mandatory software that comes with traditional appliances if they choose software-defined storage. They can hand-pick the components and software that best support their goals. While this approach does require more technically trained staff, the flexibility afforded by software-defined storage delivers a simpler, stronger and more tailored data center for the company’s needs. Furthermore, a software-defined approach uses a horizontal architecture that streamlines and redistributes data, which eliminates the potential bottlenecking problems of vertical, single-entry-point models. Data is handled faster and more efficiently, and this non-hierarchical construction can be scaled out easily and cost-effectively.
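To make the contrast with single-entry-point models concrete, the non-hierarchical idea can be sketched with consistent hashing, a common technique for spreading data across interchangeable nodes so that any node can serve as an entry point. This is a minimal illustrative sketch, not a description of any vendor's actual implementation; the node names are invented:

```python
import hashlib
from bisect import bisect_right

class ConsistentHashRing:
    """Spread object keys across a ring of storage nodes so no single
    node is a mandatory point of entry. Illustrative sketch only."""

    def __init__(self, nodes, vnodes=100):
        # Each physical node gets many virtual points on the ring,
        # which smooths out the key distribution.
        self.ring = sorted(
            (self._hash(f"{node}#{i}"), node)
            for node in nodes
            for i in range(vnodes)
        )

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key):
        # Walk clockwise to the first virtual point at or after the
        # key's hash, wrapping around the end of the ring.
        idx = bisect_right(self.ring, (self._hash(key),)) % len(self.ring)
        return self.ring[idx][1]
```

Because every node knows the same ring, a request for any object can land on any node and be routed directly to the right place, and adding a node remaps only a small fraction of the keys, which is what makes scale-out cheap.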
Spreading the Storage Layers
The horizontal architecture approach will meet the demands of the IoT well. With millions of devices needing to access storage, the current storage model that uses a single point of entry cannot scale to meet the demand. To accommodate the ballooning ecosystem of storage-connected devices all over the world, telcos need to be able to spread their storage layers over multiple data centers in different locations worldwide. It’s becoming increasingly clear that one data center is not enough to meet the storage needs of the Internet of Things: storage must instead be distributed so that it can run across several data centers globally.
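One simple way to picture geo-distributed storage is deterministic replica placement: hash each object to a primary data center, then place extra copies in other data centers. The data-center names and replica count below are purely hypothetical, a sketch of the general idea rather than any provider's placement algorithm:

```python
import hashlib

# Hypothetical data-center list for illustration only.
DATA_CENTERS = ["eu-west", "us-east", "apac-south"]

def placement(object_key, replicas=2):
    """Return the data centers holding an object: a hash-chosen primary
    plus `replicas` follow-on copies, wrapping around the list."""
    h = int(hashlib.sha256(object_key.encode()).hexdigest(), 16)
    primary = h % len(DATA_CENTERS)
    count = min(replicas + 1, len(DATA_CENTERS))
    return [DATA_CENTERS[(primary + i) % len(DATA_CENTERS)]
            for i in range(count)]
```

Because placement is computed from the key itself, any data center can answer the question “where does this object live?” without consulting a central directory, which is exactly the property a single-entry-point design lacks.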
Storage for the Interconnected World
As billions of connected devices churn out petabytes of data each day, the IoT will continue to challenge current data center capacities. Alternative storage approaches must be found if telecoms hope to scale to meet their growth goals within budget constraints. Software-defined storage offers a horizontal architecture that alleviates the bottlenecks that could create serious service issues. It also enables flexibility, speed and scalability at a reduced cost. Such a scalable and cost-effective solution will stand providers in good stead as they distribute their IoT data across the globe.
About the Author:
Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, Stefan has designed and built numerous enterprise-scale data storage solutions designed to store huge data sets cost-effectively. From 2004 to 2010, Stefan worked in this field for Storegate, the wide-reaching Internet-based storage solution for consumer and business markets, with the highest possible availability and scalability requirements. Previously, Stefan worked on system and software architecture on several projects with Swedish giant Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.