

Survey Says: We Need More Network Forensics and Real-Time Visibility at 10G!

We launched our 2013 State of Faster Networks survey back in July of this year with the goal of determining exactly how widespread the migration to 10G is and what kinds of problems companies are encountering along the way.

Over the past few months we compiled your responses and are excited to share them with you. We thank all of you who participated in helping us better understand the footprint of 10G, and we hope the results below shed some light on how you can fully utilize, and transition to, faster networks.

Who Took the Survey?
We received responses from more than 100 network engineers and IT directors, and as you will see below, the majority of respondents are already supporting 10G networks.

Key Findings
Our survey questions addressed two central topics in relation to faster speed networks: network analysis and network forensics.

The survey found that while many companies support 10G networks (72 percent), network engineers are facing a myriad of challenges in migrating to 10G. For example, 43 percent of respondents cited limited or no network visibility as the biggest challenge in transitioning to faster speed networks. Other challenges respondents cited include:

  • Limitation of security monitoring (21 percent)
  • Lack of real-time network analysis (20 percent)
  • Lack of smart tap compatibility with an existing solution (16 percent)

When it came to network forensics, we found only 31 percent of respondents are implementing some type of network forensics solution in their network, even though 85 percent stated it was essential. What is the reason for such a large discrepancy? Cost is a huge inhibiting factor: nearly half of respondents identified cost as the biggest obstacle to enabling network forensics on high-speed networks.

The graph below details additional challenges with network forensics at 10G:

[Graph: additional challenges with network forensics at 10G]

Network forensics is the only practical approach at 10G. It allows you to quickly solve everything from intermittent network failures to major security risks; attempting to recreate a network problem at 10G is like trying to find a needle in a haystack! Of all our statistics, we were most surprised to learn this practice wasn’t more pervasive.

What Does it all Mean?
Now that we’ve seen the results and determined what companies need in order to make the switch to faster networks, what are the next steps? Survey respondents made it clear they need more real-time statistics and faster forensic search times in order to migrate to 10G, yet several factors stand in their way.

WildPackets offers many different network analysis solutions at 10G that fit any budget and provide the real-time insight needed. Check out our whitepaper, “Network Forensics in a 10G World,” to learn more about how you can better perform network monitoring and analysis at 10G.

Best Practices for Managing Colossal Networks

40G is more than just a bigger pipe; it introduces significant new challenges in monitoring and analyzing the data traversing the network. You can no longer rely on the “break/fix” or “point and shoot” troubleshooting techniques of the past, applied only after problems have been reported. These high-speed networks require proactive, ongoing network monitoring and analysis to keep them performing as designed. And of course your tools must evolve just as rapidly as your network, which is certainly not always the case.

Monitoring and analysis of 40G networks requires updated tools as well as new strategies and approaches. Let’s take a look at some of the key, though perhaps not so new, strategies that must be employed when considering how to monitor, analyze, and troubleshoot a 40G network.

Capturing All of the Data – All of the Time
Performing on-the-fly analysis or trying to recreate problems that you missed the first time around is no longer feasible on high-speed networks. It is essential to capture network data 24×7 and store as much detailed data, down to the packet level, as you can. By doing so, you have a recording of everything that happened on the network, and you can rewind the data at any time to analyze a specific period of time, usage of a specific application, activity on a particular subnet, or even the details of a specific network flow. To do this effectively, we suggest purchasing a purpose-built network forensics solution, one that is specifically designed for high-speed networks and that also includes a rich set of real-time statistics. This will help keep all of your data in a single repository for easy post-capture analysis.

Your network forensics solution may not be the only appliance that needs access to the 40G network stream. One way to simplify the collection of 40G network data for detailed analysis is by using an aggregation tap instead of connecting an appliance directly to the 40G network via a dedicated tap. This will provide significant flexibility when dealing with the 40G stream. You can just replicate the 40G feed to multiple network tools, or you can use the built-in filtering to send subsets of the traffic to different network tools, depending on your data analysis needs.
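To make the replicate-versus-filter distinction concrete, here is a minimal conceptual sketch in Python. Real aggregation taps do this in hardware at line rate; the tool names, ports, and rules below are hypothetical, not features of any particular product.

```python
# Conceptual model of an aggregation tap's two forwarding modes.
# Tool names and filter rules are hypothetical.
TOOLS = ["forensics_recorder", "ids_sensor", "voip_analyzer"]

def replicate(packet: dict) -> list:
    """Mode 1: copy the full 40G feed to every attached tool."""
    return list(TOOLS)

def filter_and_steer(packet: dict) -> list:
    """Mode 2: send only a relevant subset of traffic to each tool."""
    destinations = ["forensics_recorder"]        # the recorder sees everything
    if packet.get("dst_port") in (5060, 5061):   # SIP signaling -> VoIP tool
        destinations.append("voip_analyzer")
    if packet.get("dst_port") in (80, 443):      # web traffic -> IDS
        destinations.append("ids_sensor")
    return destinations

# A SIP packet is steered to the recorder and the VoIP analyzer.
print(filter_and_steer({"dst_port": 5060}))
# ['forensics_recorder', 'voip_analyzer']
```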

Storage capacity is a primary concern when performing network recording. Let’s say your average usage on your 40G link is 25%, or 10Gbps. At this data rate, a network recording appliance with 32TB of storage can record about 7 hours of network data. An aggregation tap can also help here, allowing you to split the data stream among multiple network recorders to achieve higher aggregate capture and storage rates. Another option is to connect your network recorder to a SAN for additional data storage.
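As a quick sanity check on that arithmetic, here is the back-of-the-envelope calculation in Python. All figures are the example numbers above, not measurements from any particular appliance.

```python
# Back-of-the-envelope retention estimate for a network recorder.
link_speed_gbps = 40        # link capacity, gigabits per second
utilization = 0.25          # average utilization (25% => 10 Gbps sustained)
storage_tb = 32             # recorder storage, terabytes (decimal)

sustained_gbps = link_speed_gbps * utilization   # 10 Gbps
bytes_per_second = sustained_gbps * 1e9 / 8      # 1.25 GB/s written to disk
storage_bytes = storage_tb * 1e12

retention_hours = storage_bytes / bytes_per_second / 3600
print(f"Retention at {sustained_gbps:.0f} Gbps: {retention_hours:.1f} hours")
# Retention at 10 Gbps: 7.1 hours
```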

Understanding What is Normal
Knowing how you expect your network to perform is all the more critical when trying to analyze colossal networks. In advance of an investigation, you’ll want to establish clear baselines for your network. If you’re already embroiled in a complex network analysis firefight, it is too late to discover that your ability to assess “normal” conditions on the network is lacking.
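As a minimal sketch of what a baseline can look like in practice, the snippet below flags utilization samples that stray from a known-good period. The sample values and the three-sigma threshold are hypothetical; a real deployment would baseline many metrics (utilization, latency, application mix) per network segment.

```python
from statistics import mean, stdev

# Hypothetical per-minute utilization samples (fraction of link capacity)
# collected during a known-good period; in practice these would come from
# your monitoring appliance's statistics export.
baseline_samples = [0.22, 0.25, 0.24, 0.27, 0.23, 0.26, 0.25, 0.24]

mu = mean(baseline_samples)
sigma = stdev(baseline_samples)

def is_anomalous(sample: float, n_sigmas: float = 3.0) -> bool:
    """Flag a utilization sample that deviates from the baseline."""
    return abs(sample - mu) > n_sigmas * sigma

print(is_anomalous(0.26))  # False: within the normal range
print(is_anomalous(0.60))  # True: well above the established baseline
```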

Analyzing the Essentials
When faced with an issue on your network, you’ll want to first analyze the essentials. The temptation is to try to capture and analyze everything, especially when the source of the problem is not immediately known. You do, however, know certain things about your network, which allows you to be selective in the analysis options you choose. Often a variety of conditions can be ruled out immediately, and using these clues to limit collection and analysis to only what is necessary dramatically improves network analysis performance. For example, if you’re looking at a 40G network link, you’re probably not capturing wireless traffic, so you can turn off wireless analysis. Turning off analyses that aren’t relevant to your investigation refines your search, making it more specific, and frees processing capacity, increasing the throughput of the appliance you’re using.

Knowing the Limits
Even after analysis has been streamlined to only the essential areas of the network, data capture for network analysis on 40G networks generates a great deal of data quickly, and managing that data becomes a significant challenge. Effective analysis requires that you know the limits of your tools: not just the available storage space, but also the processing limits of your appliance and how many users can access it concurrently to perform analysis.

Moving from 1 to 10 to 40G introduces new challenges that are still being worked out in the industry, especially when it comes to support for network monitoring, analysis, troubleshooting, and security tools.

If you are in the midst of an upgrade or are thinking about upgrading to 40G, be sure to include the correct tools in the upgrade plan and budget, including solutions for establishing network baselines, capturing and storing data 24×7, and performing network forensics as needed. It’s easy to continue to treat these networks like 1G, but they’re vastly different and require new strategies for analysis.

Why On-the-Fly Analysis Doesn’t Work at 10G

Remember when Pluto was a planet and performing ad hoc network analysis was the way it was done? Whether you have already made the switch to a 10 Gigabit (10G) network or you are about to, the way in which you monitor and analyze your network must change.

Traditional network analysis gave you a lot of flexibility. When a problem arose, you simply connected your network analyzer, started a trace, and tracked down the cause. If the problem had occurred in the past, you would attempt to replicate it and then solve the issue. But with 10G there is far too much data to attempt to reproduce problems for analysis.

So, how do you conquer this problem using the same equipment as you did with 1 Gigabit (1G)?

In most cases, you simply can’t. 10G and, in the future, 40G networks require different equipment to monitor 24/7. The days of using laptops and built-in network interface cards (NICs) are over. You need dedicated appliances that are purpose-built to monitor and analyze 10G networks on an ongoing basis. Reproducing problems is no longer a quick, smart, or feasible option on highly utilized 10G networks, so having a solid network forensics solution in place is essential, not only for uncovering security breaches (as most people assume is its primary use), but also for examining common issues on your network, like spikes in utilization, drops in VoIP call quality, and increased latency, whether network or application.

Instead of the point-and-shoot solution you used with 1G, you need a different approach when handling your 10G networks. With 10G, you need to identify key analysis points, put equipment in place that can monitor 24/7 with alarms and alerts, and record network data at your peak data rate. When a problem is detected, you will already have the data stored, and you can simply rewind the data (rather than replicate the problem), analyze it, and identify the root cause of the issue.
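As a toy illustration of the rewind idea, here is a minimal sketch using Scapy to pull a time window out of a stored capture. The file name and timestamps are placeholders, and a purpose-built 10G recorder would use its own indexed query interface rather than reading flat pcap files, but the concept is the same: the packets are already on disk, so you query a window instead of reproducing the problem.

```python
from scapy.all import rdpcap  # pip install scapy

# Hypothetical stored capture; a real recorder indexes terabytes of
# traffic, but the "rewind" concept is identical.
packets = rdpcap("recorded_traffic.pcap")

# Rewind to the window in which the problem was reported
# (epoch timestamps here are placeholders).
window_start, window_end = 1380000000.0, 1380000060.0
window = [p for p in packets
          if window_start <= float(p.time) <= window_end]

print(f"{len(window)} packets captured during the window of interest")
```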

What does this cost? Is it worth it? It does require an up-front investment to monitor your 10G network, but it will save you big in the end by avoiding network downtime, improving tier one application performance, and increasing productivity within the company, with your network consistently running quickly and smoothly.

To learn more about how to correctly analyze and monitor at 10G, check out this short video.