Today’s networks depend on software more than ever, and because of that we have seen the big networking vendors invest heavily in their software divisions. The rollout of 5G infrastructure is underway, and along with the fiber, spectrum, and hardware, it requires a whole lot of software to make it all work the way it has been promised. Nokia Software is one of the organizations behind much of that software. With us today to talk about Nokia Software’s approach to this brave new world and his viewpoint on what is to come is Nokia Software’s CTO, Ron Haberman.
TR: What is your background, and what was your journey to your present role as CTO of Nokia Software?
RH: I’ve been in the networking space for over 25 years, both on the service provider side and on the vendor side. Before software and telecom, I was primarily involved in the routing space, more on the IP side of the house, with Alcatel-Lucent. I rejoined Alcatel-Lucent five years ago after selling my cloud gaming startup. Alcatel-Lucent was then acquired by Nokia, and my team became part of Nokia Software. I managed the CloudBand business unit for a couple of years, then expanded to manage the Emerging Products group, which included cloud, analytics, IoT, cybersecurity and self-organizing networks (SON) — most of the new technologies that are making their way into telecom. When those “graduated” into becoming full businesses, they were distributed across the broader Nokia Software organization, and I was appointed CTO.
TR: What types of technology does Nokia Software focus on?
RH: We are organized in a product technology and engineering structure, with three main areas of products that we’re developing. The first one is called apps, which includes primarily OSS and BSS orchestration systems, billing and charging, cybersecurity, analytics/customer experience, and SON. The second product area is core engineering, which is pretty much everything that has to do with the other side of your mobile handset. And the third is network management, where our focus is primarily on radio but is now expanding into other domains. All of these products follow a multi-technology, multi-vendor approach. Every piece of software that we create is designed from the ground up to manage multiple domains: fixed, wireless, backhaul transport, etc. For example, our SON solution is installed in environments where the radio is 100% from one of our competitors, and we still optimize it for the service provider. The same is true for our network management, OSS, cloud orchestration, cloud platforms, cybersecurity, and assurance systems. Each product knows how to attach to care systems or core and edge elements from different providers, and creates a customer experience that cuts across all of those environments.
TR: So what have you been working on lately?
RH: We just recently announced our cloud-native assurance and experience software, as well as our new element management system. Our focus, especially when it comes to management systems like assurance, is to A) be multi-vendor and B) complete the transition to being fully cloud-native, moving to containers and a microservices type of architecture. We also focus heavily on lifecycle management capabilities, built-in telemetry, and the ability to run in pretty much any environment, including public cloud. Those three areas (element management systems, assurance, and experience) complete how the modern operations center should be run. As an example, when we talk about the Experience Center, it’s important to highlight that when a service provider is managing the network, it is no longer sufficient just to see that routing is performing according to a particular KPI or that the uptime of a particular element is within spec. We must look more broadly at all the different services a particular subscriber has registered for and understand the level of satisfaction that that individual or business will have. Our new Experience Center provides a very high-level view of pockets of potential issues, and then you can drill down very easily to understand the satisfaction level of every individual customer.
TR: What are you working on currently that we should expect to hear more about this year?
RH: Slicing is going to become a very, very important part of mobile networks. We have almost 70 5G contracts, with almost 20 service providers that have enabled 5G already. And the road is just beginning, because the key to 5G will be the new types of business that service providers can create, which we believe will focus on B2B2C. It will be more enterprise-focused than consumer-focused. For that, you need to be able to slice the network and create an offering that is tailored to the particular application needs of the enterprise. Up until now, every new ‘G’ focused primarily on creating more bandwidth. 5G obviously comes with more bandwidth too, but with the new dimension of latency combined with slicing. A next-generation enterprise that is looking to make a factory completely wireless, or a fulfillment center seeking better communication between the different elements of its business (for example, a harbor and the operations around it, together with an aerial view managed by drones), requires specialized treatment from the network. The idea is that a service provider, with the help of our orchestration system, can let the enterprise IT leader tailor a network for that particular environment and then orchestrate an instance of it on demand. That is going to signal a huge transition from today’s fixed approach, in which you create a network package, sell it for a while, then create another one, and so on, to a very dynamic, IP-based management of such environments.
TR: How mature is the technology for such network slicing today? When will it be ready for prime time?
RH: We are still working with the service providers. I think that this year we will focus primarily on proofs of concept and early adopters of the technology. We have several proofs of concept ongoing, especially for the network orchestration piece. As an industry, we are adopting in parallel a more agile way of working, with methodologies such as design thinking. This is where we invest most of our time today, fine-tuning what will actually make the biggest impact in how we orchestrate these capabilities into existence. And of course, on the broader side, the networks still need to be built, and radios with 5G capability need to be deployed more widely. So this will be an ongoing effort together with the service providers.
TR: In what other ways do you think 5G will change things, both for the consumer and the enterprise?
RH: On the consumer side, 5G is going to be pretty much the natural ‘G’ replacement. Consumers will be offered new phones and other devices with 5G built in, and they will either replace their previous phone or add a device. But the enterprise is on a separate track, which is going to be fairly new in multiple ways. One is that if in the past you needed 100Mbps or 1Gbps, your only option was really a wired connection. With 5G, the connectivity is now good enough that you have the option of being wireless, which means it’s easier for the service provider to reach that enterprise. Another, which will grow in parallel, is the slicing capability for private instances, where there are defined use cases in mining or transportation or harbors, as I mentioned before, just waiting for availability from the service providers. And then there is a set of areas, such as cloud gaming, that can benefit from low-latency connectivity but have not yet been deployed at scale. We expect more of those to be developed over time once the capabilities become available.
TR: Other than 5G, what other new technologies are you investing resources in?
RH: Machine learning is embedded now in pretty much every product that we make. SON, for example, benefits quite a bit from machine learning. The Experience Center itself is powered by a machine learning stack that we’ve developed in order to understand the patterns and the correlations between the different areas of the business. For example, a consumer might have a particular experience at home that is different from the one en route to work. When you correlate that with the billing experience and the care experience that you may have with that service provider, it can lead to a better understanding of how to serve you better, what new offerings you might be interested in, and the level of risk that you might leave that particular network. This is part of our Experience Center itself. As we move further and further into becoming fully cloud-native, we’re also making a huge investment in making ourselves ready for public cloud consumption. That further expands our ability to use the AI and machine learning technologies that are available from the different public clouds.
TR: AI can mean many things, from basic pattern recognition all the way to sci-fi characters with personality. Where do you think we are now on that spectrum?
RH: I think we idealize the term AI quite a lot, but we’re not close to the sci-fi version. What most people still mean by AI is better machine learning. The neural networks get deeper and deeper to create a much more in-depth understanding of the different sets of data that we have. And in a network one has quite a lot of data sets: location data, usage data, application data, the signals from the actual network, and many more. We are combining more and more of these data sources to create better insight for the service provider, both to automate the creation of services and to make their subscribers happier. It’s going to be a while yet before the machines take over the world.
TR: In between those poles, however, at what point do we say machine learning is thinking?
RH: When we think of the future of AI, we do think of a cognitive aspect, which today is not there. Even if you take something super-advanced like the machine learning used by self-driving vehicles, there’s no cognitive element. The vehicle is basically reacting to a very broad set of data. That’s also how I see the self-driving network. We are adding more and more abilities by integrating it with more data sources and enabling it to process all this data and come to a better proposed next action. But from there to actually having the cognitive ability to reason the way a human would, I do think we’re still far from that.
TR: What’s the biggest challenge ahead for Nokia Software and others like you in today’s market? What keeps you up at night right now?
RH: The biggest thing generally, from my perspective, is the rate of adoption and deployment of 5G. Obviously, we want to move pretty much as quickly as possible. With every previous network generation, it was only in hindsight that we could see what defined it: 2G was really the beginning of data, then 3G brought email and basic internet, and 4G brought the full internet and focused on video. With 5G, we have some theories, but the key will be to create enough programmability in the network, and to enable consumption of the network in a programmable way, so that developers can really take advantage of it to create truly new types of applications that benefit from low-latency capabilities. That’s what I worry about the most.
TR: Who do you think it will be that writes the applications that can take advantage of all that programmability? Will it come from today’s big tech? From someone’s garage?
RH: I think it will be both. We have things like virtual reality that have already been identified, and we understand how latency can help. Physical security companies can automate surveillance with things like automatically flown drones. Those kinds of things will come from established, big tech companies. But if we learn anything from history, it’s that there is a completely new, unknown set of apps that will come from new developers. Someone will identify a need, and it’s now easier than ever to publish your app via environments such as public cloud providers like AWS for enterprise applications, or the Google Play Store for a mobile application. In order to enable that type of innovation, one of the biggest capabilities in 5G is the network exposure function and the fact that these types of environments can become programmable. How that gets orchestrated into existence, and then how broadly you can extend this enablement to app developers, will be the key to success.
TR: Does opening up so much of the infrastructure in a programmable way also introduce some new security risks?
RH: There are some security aspects, and it is important to highlight that. And Nokia has invested quite a bit in the security aspect. For IoT, which effectively opened up the attack surface by allowing very, very simple, usually unmanaged devices to be attached, Nokia can reinforce security with machine learning on the network side, understanding what these devices are doing and how they are doing it. For anomaly detection, we have invested in an area called SOAR, or Security Orchestration, Automation and Response, which, again, uses the behavior of the different elements in order to understand the intent, and any variation of that intent, of the different network elements. And for years we’ve been quite big in identity management, making sure that the use of the different elements of the network is based on known profiles, with very strict policies on who is supposed to be doing what, and when. So, if somebody’s trying to exploit the programmability of the network, we do provide tools to prevent them from doing that.
TR: Thank you for talking with Telecom Ramblings!