In a response to my exaflood post, a regular commenter, Frank Coluccio, brought up a subject that I had skirted but which deserves discussion. Specifically, he pointed out that *how* we reach for 100G and beyond matters a great deal, and is currently in dispute. This is a difficult subject to talk about without descending into the netherworld of technical acronyms never to return, but I'm going to try anyway.
In order to keep up with bandwidth demand, providers will need to hook up bigger pipes – nobody disputes that. It wouldn't make sense to keep 10G as the biggest pipe on a 10Tbps network: to increase capacity by just 10%, one would have to add 100 new pipes – totally insane and impossible to manage. So we *must* go to 100G and then on to larger pipes. The size of the basic unit must be reasonable compared to the size of the network.
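The arithmetic behind that claim can be sketched in a few lines. The numbers below simply mirror the example above (a 10Tbps network growing 10%, with 10G vs. 100G links); they're illustrative, not drawn from any real deployment.

```python
def links_needed(added_capacity_gbps, link_rate_gbps):
    """Number of new links required to add a given amount of capacity."""
    return -(-added_capacity_gbps // link_rate_gbps)  # ceiling division

network_gbps = 10_000                 # a 10 Tbps network
added = int(network_gbps * 0.10)      # grow capacity by 10% -> 1,000 Gbps

print(links_needed(added, 10))        # 100 new 10G pipes - unmanageable
print(links_needed(added, 100))       # 10 new 100G pipes - tractable
```

The point scales: as the network grows another order of magnitude, the same 10% upgrade would demand 1,000 small pipes, which is why the basic unit has to grow with the network.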
But there are two ways to go about it. We could increase the density by putting more data in each and every wavelength – this is the optical purist's approach. Or we could take a bunch of 10G wavelengths, aggregate them, and package them so nobody can tell what we did – this is the practical engineer's approach. In the next few posts, I'll try to lay out the advantages, drawbacks, and implications of each, and then I'll give my own WAG on who wins.
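To make the practical engineer's approach concrete, here is a toy illustration (not a real transport implementation, and the lane count and striping scheme are my own simplification) of aggregating ten 10G lanes into one logical 100G pipe: frames are striped round-robin across the lanes and reassembled in order, so the client side never sees the seams.

```python
LANES = 10  # ten virtual 10G lanes standing in for one logical 100G pipe

def stripe(frames):
    """Distribute frames round-robin across the virtual lanes."""
    lanes = [[] for _ in range(LANES)]
    for i, frame in enumerate(frames):
        lanes[i % LANES].append(frame)
    return lanes

def reassemble(lanes):
    """Interleave the lanes back into the original frame order."""
    out = []
    for i in range(max(len(lane) for lane in lanes)):
        for lane in lanes:
            if i < len(lane):
                out.append(lane[i])
    return out

frames = list(range(25))
assert reassemble(stripe(frames)) == frames  # round trip is transparent
```

Real systems have to handle per-lane skew, failures, and reordering, which is exactly where the engineering (and the dispute) lives.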