The IT world is in the throes of a seismic shift, the kind the industry experiences every five to ten years. Today, the arrival of technologies like mobile, software defined networks, virtualization, and cloud computing has changed the landscape for both the consumer and the enterprise.
Change is, of course, inevitable and welcome. But while all of these technologies either hold great potential or are already fueling better productivity within IT, many unanticipated challenges are cropping up. Below we look at some of the challenges these top trends introduce, and how to adjust so your organization can get the most from these new technologies.
Software Defined Networks and OpenFlow
Software defined networks (SDN) and OpenFlow have been touted as enabling technologies that will help tame the complexity of cloud and virtualization. SDN describes the overall architecture, while OpenFlow is a specific SDN protocol, created as a programmable way to manage and direct traffic among switches from an assortment of vendors. Ideally this provides centralized control and easier management of potentially cheaper switches, without single-vendor lock-in.
However, these technologies present real challenges for network engineers. Centralized network control sounds good in theory, but migrating to OpenFlow requires creating new network-wide policies. It's likely that we'll hear about large "failed" OpenFlow deployments, where the effort required overwhelms the projected ROI. The challenge for OpenFlow now is to live up to the hype: deliver demonstrable performance improvements without requiring a forklift upgrade of the network core. OpenFlow has enormous potential, but if it's too hard to deploy, it will never truly leave the research environment where it was born.
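To make the model concrete, here is a minimal sketch of how an OpenFlow-style flow table behaves: a controller installs match/action rules, and the switch applies the first rule that matches each packet. The rule format and field names below are purely illustrative, not a real controller or switch API.

```python
# Illustrative sketch of an OpenFlow-style flow table lookup.
# All field names and actions here are hypothetical, for explanation only.

def matches(rule, packet):
    """A rule matches when every field it specifies equals the packet's value."""
    return all(packet.get(field) == value for field, value in rule["match"].items())

def forward(flow_table, packet):
    """Return the action of the first matching rule, else the table-miss action."""
    for rule in flow_table:
        if matches(rule, packet):
            return rule["action"]
    return "send_to_controller"  # table miss: ask the controller what to do

flow_table = [
    {"match": {"dst_ip": "10.0.0.2"}, "action": "output:port2"},
    {"match": {"dst_ip": "10.0.0.3", "tcp_port": 80}, "action": "drop"},
]

print(forward(flow_table, {"dst_ip": "10.0.0.2"}))     # output:port2
print(forward(flow_table, {"dst_ip": "192.168.1.9"}))  # send_to_controller
```

The table-miss case is where the "centralized control" lives: unknown traffic is punted to the controller, which can then install a new network-wide rule on every switch at once.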
For more details on the history and use cases of SDN and OpenFlow, check out our blog “Software-Defined Networking and OpenFlow to Infinity and Beyond.”
Virtualization

More and more companies are turning to virtualized environments to streamline application deployment, simplify IT operations, and allow IT organizations to respond faster to changing business demands. With prices falling and administrative tools making management easier, virtualization is now being adopted even by smaller mid-market organizations.
But virtualization creates "blind spots" in your network: areas where application traffic cannot be properly monitored with traditional techniques, opening the network up to undetected application performance problems. In a traditional server environment, you would span a port on a physical Ethernet switch or router and stream the data into a network/application performance analysis appliance, giving you complete visibility. In a virtual environment, however, traffic passes through a virtual adapter without ever hitting a physical switch. The appliance is blind to it, so communication between virtualized applications on the same server is never seen.
To combat this blind spot and successfully perform network analysis in a virtual environment, you must plan ahead. The analysis techniques themselves change little; the implementation is what differs. Instead of capturing data at the physical layer, you need a solution that can collect data at the level of the virtual switches.
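A toy model makes the blind spot easy to see. In this sketch (all VM and host names are hypothetical), a probe fed from a physical SPAN port only sees flows that actually cross the wire between hosts, while a probe attached at the virtual switch sees everything the vSwitch forwards:

```python
# Toy model of SPAN-port visibility vs. virtual-switch visibility.
# Hostnames and VM names are invented for illustration.

flows = [
    {"src": "vm1", "dst": "vm2", "src_host": "esx1", "dst_host": "esx1"},  # same host
    {"src": "vm1", "dst": "db1", "src_host": "esx1", "dst_host": "esx2"},  # crosses the wire
]

def visible_to_physical_span(flow):
    # Traffic only reaches the physical switch when the two hosts differ.
    return flow["src_host"] != flow["dst_host"]

def visible_to_vswitch_probe(flow):
    # A probe at the virtual switch sees all traffic the vSwitch forwards.
    return True

span_view = [f for f in flows if visible_to_physical_span(f)]
vswitch_view = [f for f in flows if visible_to_vswitch_probe(f)]
print(len(span_view), len(vswitch_view))  # 1 2 -> the SPAN misses the intra-host flow
```

The intra-host flow is exactly the traffic a traditional appliance never sees, which is why the capture point has to move into the virtual layer.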
Mobile and Wireless Networks
In today’s digital age, wireless networks are essential to both businesses and consumers. However, maintaining strong performance and security of wireless networks can be difficult — especially in the era of BYOD (Bring Your Own Device). And keeping up with the pace of technology can also be challenging, with 802.11ac and 802.11ad right around the corner.
The introduction of wireless-enabled smartphones and tablets has ushered in new challenges for wireless network management, most notably in security and performance. On top of the authorized workstations, network admins must now account for and secure a whole new set of devices that are not within their direct control. As for performance, not only do more devices make for a more congested wireless network, but a powered-on, inactive smartphone that is not connected causes at least ten times as much damage to your Wi-Fi network as the same phone when it is connected (see http://www.sniffwifi.com/2012/04/phones-on-wlan.html for the details).
You need a full-featured wireless network analysis solution in place and monitoring your network 24×7, searching for unauthorized devices and analyzing overall network conditions, like excessive probe requests/responses that can drag down your aggregate WLAN throughput. You also need a solution that will future-proof your investment as 802.11ac and 802.11ad begin to take hold.
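As a rough illustration of one analysis such a tool performs, the sketch below counts probe requests per client in a captured frame log and flags the chatty devices. The frame records, MAC addresses, and threshold are all made up for the example; a real analyzer would decode 802.11 management frames from a monitor-mode capture.

```python
# Illustrative sketch: flag clients generating excessive 802.11 probe requests.
# Frame records and the threshold are invented for this example.
from collections import Counter

frames = [
    {"type": "probe_request", "src": "aa:bb:cc:00:00:01"},
    {"type": "probe_request", "src": "aa:bb:cc:00:00:01"},
    {"type": "probe_request", "src": "aa:bb:cc:00:00:01"},
    {"type": "data",          "src": "aa:bb:cc:00:00:02"},
    {"type": "probe_request", "src": "aa:bb:cc:00:00:02"},
]

PROBE_THRESHOLD = 2  # probes per capture window; tune for your environment

probes = Counter(f["src"] for f in frames if f["type"] == "probe_request")
noisy = [mac for mac, count in probes.items() if count > PROBE_THRESHOLD]
print(noisy)  # ['aa:bb:cc:00:00:01']
```

The same counting approach extends to probe responses, deauth frames, or any other management traffic whose volume signals trouble on the WLAN.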
Change is always a mix of good and bad, but these new technologies bring a wealth of new opportunities. To stay ahead of the curve, it's important to know how and when to adapt, and which tools will help you get there.