A Disruptive Manifesto in a Modest Proposal’s Clothing: Measuring Quality Delivered

October 11th, 2021

This Industry Viewpoint was authored by Magnus Olden, CTO at Domos

When Domos CTO Magnus Olden took the stage at the Broadband Forum a while back, he knew his talk wasn’t going to sit well with many in the room – career telecom folks used to talking about bandwidth and broadband in common terms, namely the all-encompassing Mbps measure. After all, and for going on decades, everyone from FTTH teams to the folks in finance has made that the very basis of their business cases for expanding networks, building out fiber infrastructure and upselling connectivity tiers to consumers. In an industry where speaking truth to power can result in so many thrown tomatoes, gongs or worse, Olden’s gambit didn’t come without its risks.

Here’s his story.

__________________

By Magnus Olden

I try not to think about some of the more uncomfortable truths before me, but as a data person it’s quite impossible not to. Call them challenges, hurdles, whatever the word, there are some “fact”-ors that I need to dispense with right out of the gate. I am probably the youngest CTO in the telecom circles in which I travel. That is both a great honor and a daunting detail. Telecom remains a “den of silver foxes”, and so it may be no surprise that I’ve often wished I were already in their generational cohort, if only to dispel any doubts they might have.

And so it was with these rather unhelpful thoughts in mind that I approached the podium on the second, jam-packed day of the Broadband Forum in Amsterdam in the fall of 2019, just months before the great anti-virus quarantine. My presentation title? Machine Learning in the Connected Home – innocuous enough to raise castle gates, but something of a trojan horse. And what better place to launch a disruptive manifesto than within the sheep’s clothing of a Modest Proposal around AI? Still, it bears underscoring that I was standing before an audience of telecom professionals with probably 10,000 years of combined experience, about to have their morning cornflakes spoiled, their kittens killed – choose your metaphor – by someone they assume has been in the business for about 5 minutes. Yikes…

After some general introductions, it was time to commence. Certainly, the AI part of the talk, the applied science part, couldn’t begin without a clear-eyed look at the very pure, predictable machine learning hypotheses fed into the algorithm. Simply put, to get to the AI part of the discussion, the assumptions going into it needed to be addressed. I am reminded of the quote (not sure whose): “there’s no such thing as Artificial Intelligence, but there is Real Stupidity”. We are right to question the inputs, lest we derive questionable outputs. Garbage in, garbage out… you get the idea.

So not a simple presentation of AI in the connected home after all, but an honest look at the logical discipline it demands.

To get there (baby steps), I echoed a sentiment heard in the halls of the WiFi standards circles I haunt: “Delivering a better internet experience is not about more bandwidth, but about better bandwidth”. Few would or could disagree with such a rosy, unqualified statement.

With me so far, they were. But now imagine that I am about to tell them that based on the findings – and not just mine – more bandwidth, as measured by Mbps, is not only not a good indicator of quality of experience, but often a very costly dial that, when turned up, can actually increase latency.

Wait… What? Mbps doesn’t matter? You mean that Mbps, the very, fully commoditized measure of what we’ve been building out for years and selling to our customers and promising more and more of, is not the measure of quality, but actually a cause of its suffering?

There are countless ways to convey the idea. Traffic jams when the lights go out, dipping one’s foot in a moving stream, etc. The Emperor’s New Clothes is a favorite go-to – reflecting, in this case, the fact that Mbps, as a buyable speed-tier package, may not improve the experience for the end user.

But I tried to keep it light, accessible, even humorous. My pizza delivery service analogy, for example, even elicited some chuckles. That’s the one where we apply today’s common gauge of the quality of a broadband service (theoretical Mbps) to grading the caliber of your local pizza delivery shop on the size of their truck. In crude terms, “would you like one hot, fresh pizza, or 100 cold ones?”

To add to the batting cage I was building for myself, I was about to tell them that in place of a relatively meaningless measurement of quality – or at least one that had fully jumped the shark some time back, when “100Mbps” was delivered – there was no commonly accepted way to rank our pizza delivery service.

It’s hard to get off the Mbps pipe, especially if there’s no alternative. Certainly this is not the message that the very deep-pocketed buyers and sellers of fully commoditized bitrates want to hear, nor even the governments and politicians they advise, many of whom now weave the idea of the Gigabit Society into the very planks of their platforms.

But here in the trenches of innovation, we want real answers, and we’ve been spending a lot of time – sometimes all of our time – trying to find them. Still, we understand that the “emperor has no clothes” moment we know is coming won’t happen until someone at last hands him a more fashionable frock to clothe himself in. To do that, we need to get the science – the weights and measures – of quality of experience just right, and demonstrate it. We can’t do that in an ivory tower. This is a mission that needs to bring all players together, and that, if anything, was the mission of the talk.

I don’t believe I disappointed. I did bring what I thought was a compelling case for applied AI in the Connected Home, based on real, repeatable data inputs, to help networks learn about and even predict application-level needs against actual resources delivered. That is what we do, and it works. The primary driver? You guessed it. Once you have delivered a theoretical amount of capacity via Mbps, reliable latency becomes the only viable measure of true quality. What happens when you open up three new top-speed lanes of on-ramp to a stretch of highway with bumper-to-bumper traffic? Not only no measurable improvement, but in fact a measurable degradation. That’s essentially what is happening when we dial up promised Mbps on a chronically congested network.
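To make that concrete, here is a minimal sketch – with hypothetical latency thresholds and made-up samples, not Domos’s actual model – of how a latency-reliability check separates two sessions that a headline Mbps figure would treat as identical:

```python
# Illustrative sketch only: thresholds and sample data are hypothetical.
# The point: once capacity is sufficient, latency reliability distinguishes
# a good session from a bad one in a way a throughput number cannot.

# Hypothetical per-application latency needs, in milliseconds.
LATENCY_NEEDS_MS = {
    "web_conference": 150,
    "cloud_gaming": 50,
    "video_stream": 300,
}

def latency_reliability(samples_ms, need_ms):
    """Fraction of latency samples that met the application's need."""
    if not samples_ms:
        return 0.0
    met = sum(1 for s in samples_ms if s <= need_ms)
    return met / len(samples_ms)

# Two sessions with identical provisioned bandwidth but different latency behavior.
steady = [30, 35, 40, 38, 42, 36, 33]          # low, consistent latency
congested = [30, 35, 400, 520, 38, 610, 33]    # same speed tier, bufferbloat spikes

for name, samples in [("steady", steady), ("congested", congested)]:
    score = latency_reliability(samples, LATENCY_NEEDS_MS["cloud_gaming"])
    print(f"{name}: {score:.0%} of samples met the cloud-gaming latency need")
```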

And what, after all, are we really measuring when we are talking about Quality of Experience? It’s the outcome of a given use case, not of the internet generally, but of a specific application. This implies a particular time frame and a certain, defined, single application-focused activity – playing a game, conducting a web conference call, streaming live video. How many stars do we give that experience at the end? We almost always get asked, and we all know it doesn’t really matter. One thing we do know is that it’s not about the theoretical Mbps speed-tier package our CSP has promised, but about the latency that session imposed on the given application we were using… or was it? More precisely, and to quote Netflix’s great dystopian time travel series Dark, “When was it?” It’s one thing to posit that latency is always the culprit, but if you can’t pinpoint when and in which segment of the network it occurred, or whether it was caused by the peak or the average (or anything in between), it’s just as useless as Mbps.
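A toy illustration of the “when was it?” point, using made-up latency samples: the session-wide average looks healthy and a single peak figure gives no sense of when things went wrong, while simply bucketing the same samples into time windows shows exactly when the experience broke.

```python
# Toy illustration with hypothetical numbers: windowed statistics reveal *when*
# latency broke the experience, which a session average or a lone peak cannot.

from statistics import mean

def window_stats(samples_ms, window=5):
    """Yield (window_index, average, worst) for fixed-size windows of samples."""
    for i in range(0, len(samples_ms), window):
        chunk = samples_ms[i:i + window]
        yield i // window, mean(chunk), max(chunk)

# Latency samples (ms) across a session; trouble hits in the middle windows.
session = [25, 30, 28, 27, 31,
           26, 29, 30, 28, 27,
           450, 500, 480, 520, 490,   # the moment the call actually fell apart
           30, 28, 27, 29, 31,
           26, 28, 30, 27, 29,
           25, 28, 30, 29, 27]

print(f"session average: {mean(session):.0f} ms, session peak: {max(session)} ms")
for idx, avg, worst in window_stats(session):
    print(f"window {idx}: avg {avg:.0f} ms, worst {worst} ms")
```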

It may have felt like a bit of a copout to refute the accepted way of doing things without at once proposing a perfect alternative. The fact is that there are many candidates, where there needs to be one. “One measure to rule them all… and in the outcome bind them”. And that one needs to be a hotly debated, cruelly criticized, fully examined, peer-reviewed, and ultimately community-accepted way of measuring not just quality of experience, but quality of experience delivered – QED, as the Broadband Forum standard shortens it.

The problem is complex, but its many facets were well expressed by some of the concluding statements from this past month’s meeting of standards stakeholders at the IETF, and the working group tasked with Measuring Network Quality for End Users. It is a problem aptly compared by working group lead Wes Hardaker, from USC’s Information Sciences Institute, to the challenge the US Environmental Protection Agency faced in creating the Air Quality Index – an intensely intricate combination of data from five sources: ground-level ozone, particulate matter, carbon monoxide, sulfur dioxide, and nitrogen dioxide. His comparison is bang on, and in no way overstates the complexity of what we are working on relative to internet user experience, nor its comparably large societal importance.
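As a loose sketch of what an AQI-style approach could look like for network quality – metric names, scales, and thresholds here are hypothetical, not a proposed QED standard – each sub-metric is normalized to a common scale, and the headline figure is driven by the worst of them, much as the EPA index reports the worst pollutant.

```python
# Loose, hypothetical sketch of an AQI-style composite for network quality.
# Each sub-metric is mapped to a 0 (good) .. 100 (bad) scale; the headline
# number is the worst sub-index, as with the EPA Air Quality Index.

def sub_index(value, good, bad):
    """Map a raw measurement to 0 (good) .. 100 (bad), clamped."""
    score = 100 * (value - good) / (bad - good)
    return max(0.0, min(100.0, score))

# Hypothetical measurements and thresholds, for illustration only.
measurements = {
    "working_latency_ms":   sub_index(180, good=20,  bad=500),
    "latency_jitter_ms":    sub_index(60,  good=5,   bad=200),
    "packet_loss_pct":      sub_index(0.4, good=0.0, bad=5.0),
    "dns_response_ms":      sub_index(90,  good=10,  bad=1000),
    "throughput_shortfall": sub_index(0.1, good=0.0, bad=1.0),
}

worst_metric = max(measurements, key=measurements.get)
print(f"headline index: {measurements[worst_metric]:.0f} (driven by {worst_metric})")
```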

On the last day of the three-day set of round-table sessions with many of the world’s leading minds in WiFi, all were asked to contribute a conclusion summing up the discussions and the current state of affairs. While the members may scold me for sharing the contents of this particular petri dish, history may thank me even if they don’t.

Though we didn’t add it in, fearing that it might seem self-serving, we would have added something to the effect of “Delivering consistently sufficient bandwidth at the application level will result in reliably efficient networks”.

Even if we don’t yet have the perfect way of measuring QED, our community is fully focused on the showstopping factors in play with latency and reliability. While the community works toward a consensus, we are secure in the knowledge that we can continue doing the work we are doing to make sure that the fifth of those concluding points (paraphrased as “measuring how bad it is alone is frustrating”) won’t be the result.

It’s nice to know that we don’t have a solution without a problem.


Categories: FTTH · Industry Viewpoint · Low Latency
