100 gig networks are on the way. The Department of Energy (DoE) has just awarded $62 million to build one.
As with any other network, visibility, and the ability to monitor and troubleshoot, must also be taken into consideration. Even on 1 gig and 10 gig networks, special hardware and software are often needed to capture and analyze all of the traffic. And even that is not sufficient when these networks are fully saturated, or experience large spikes, floods of small packets, and other anomalies.
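To put the small-packet problem in perspective, here is a back-of-the-envelope sketch (my own arithmetic, assuming standard Ethernet framing) of the worst-case packet rates a capture device has to sustain at each link speed:

```python
# Worst case: minimum-size Ethernet frames.
# 64-byte frame + 20 bytes of preamble and inter-frame gap = 84 bytes on the wire.
WIRE_BYTES = 64 + 20

def max_pps(link_gbps):
    """Worst-case packets per second on a link of the given speed."""
    return link_gbps * 1e9 / (WIRE_BYTES * 8)

for speed in (1, 10, 100):
    print(f"{speed:>3} Gbps -> {max_pps(speed) / 1e6:,.1f} Mpps")
# 100 Gbps works out to roughly 148.8 million packets per second.
```

At almost 150 million packets per second, a 100 gig analyzer has well under 10 nanoseconds per packet, which is why saturated links full of small packets break capture tools long before the nominal line rate suggests they should.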
In some ways, the network monitoring industry is still working to catch up with 10 gig networks, so yes, developing new technologies and tools for 100 gig is going to be expensive and won't be ready for prime time for a good while. But this forthcoming innovation is good for everybody downstream, as it will push the envelope and drive the next generation of networking tools and corporate revenues.
The types and number of issues surrounding the development and deployment of a 100 gig Ethernet network will depend on how deep into the network the 100 gig needs to go. Currently, there are a few options for 100 gig core routers, but beyond that, available commercial hardware stops at around 10 gigs. A quick search for network cards faster than 10 gig came up empty. And even if you found the cards, current twisted pair cable only goes up to 20 gigs. To go higher than that means re-cabling with fiber.
If the 100 gig network is just a big pipe between the major carriers, and everything in between is 1 gig and less, then the scope of the problem is pretty well defined. The cost then is a matter of rolling out 100 gig fiber if necessary, but maybe not if multiple existing smaller capacity lines can be aggregated.
If the project is more ambitious, and is attempting to go end to end, then there are a lot of problems and expenses right out of the chute. Namely, everything between the core router and the PCs has to be replaced. Even the network card in the PC has to be upgraded to 100 gig, and that has not been invented yet.
Finally, this still leaves the challenge of monitoring a 100 gig network. Currently, there is no single network analyzer that can capture at 100 gigs. One way to achieve this is with a series of load balancing taps that break the traffic down into smaller 10 gig lines, which then feed into separate analyzers working in parallel. Interesting idea, but I don’t think anyone has invented it yet.
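The tap idea hinges on splitting traffic so that each 10 gig analyzer sees whole conversations, not random fragments of them. A minimal sketch of how such a tap might assign packets, hashing the flow 5-tuple so both directions of a conversation land on the same analyzer (the function names and the CRC32 choice are my own illustration, not any vendor's design):

```python
import zlib

NUM_ANALYZERS = 10  # ten 10-gig analyzers fed by the hypothetical tap

def analyzer_for(src_ip, dst_ip, src_port, dst_port, proto):
    """Pick an analyzer by hashing the flow 5-tuple, so every packet
    of a given flow is sent to the same downstream analyzer."""
    key = f"{src_ip}|{dst_ip}|{src_port}|{dst_port}|{proto}".encode()
    return zlib.crc32(key) % NUM_ANALYZERS

def analyzer_for_flow(ip_a, ip_b, port_a, port_b, proto):
    """Normalize endpoint order first, so that the request and the
    reply of one conversation hash to the same analyzer."""
    a, b = sorted([(ip_a, port_a), (ip_b, port_b)])
    return analyzer_for(a[0], b[0], a[1], b[1], proto)

print(analyzer_for_flow("10.0.0.1", "10.0.0.2", 12345, 80, "tcp"))
```

The catch, of course, is that hashing only balances well when traffic is spread across many flows; one elephant flow still pins a single 10 gig analyzer, which is part of why this is harder than it looks.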
Perhaps we need some town hall meetings to discuss?