The “third platform”—defined by IDC as the next-generation IT software foundation that includes cloud computing, Big Data, social engagement and mobile—has disrupted IT far faster than either of the first two platforms, mainframe and client/server/Internet technology. This disruption will profoundly affect all organizations as they shift their IT focus to scale up their cloud, mobile and Big Data capabilities.
This current platform generates unheard-of volumes of data, all of which must be analyzed, secured and managed in real time. To meet this challenge, organizations need to make use of the emerging technologies of software acceleration platforms and tools. These new tools are necessary to maximize engagement and value for organizations looking to enable innovation and remain ahead of the data explosion.
Guest article by Dan Joe Barry, Vice President of Positioning and Chief Evangelist, Napatech
As analyst firms monitor the ever-changing IT landscape, it is clear that regardless of the platform or the means of delivery, the volume, variety and velocity of data in networks continue to grow at explosive rates. Performance and application monitoring has become a pressure cooker as network engineers work to deliver these massive data streams in real time, with multiple usage crises dragging down network performance at any given moment.
What Appliance Design Requires Today
The third platform brings with it a need for software acceleration and support across cloud, Big Data, mobile and social venues. To address this need, hardware acceleration must both abstract and de-couple hardware complexity from the software and provide performance acceleration. De-coupling the network from the application layer helps realize this goal, while at the same time opening appliances up to new functions that are not normally associated with their original design.
The benefits of using high-performance network adapters are significant. With their assistance, administrators can identify well-known applications in hardware by examining layer-one to layer-four header information at line-speed. By clearly delineating what is performed in hardware and what is performed in application software, more network functions can be offloaded to hardware. This allows application software to focus on application intelligence while freeing up CPU cycles, so that more analysis can be performed at greater speeds.
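To make the header-inspection idea concrete, here is a minimal Python sketch (not the adapter's actual in-hardware logic; the function name and framing assumptions are illustrative) that extracts the classic flow 5-tuple from a raw Ethernet/IPv4 frame:

```python
import struct

def extract_five_tuple(frame: bytes):
    """Return (src_ip, dst_ip, src_port, dst_port, proto) for an
    IPv4 TCP/UDP frame, or None if the frame is something else."""
    if len(frame) < 38:                  # too short for Eth + IPv4 + ports
        return None
    # Ethernet header: 6B dst MAC + 6B src MAC + 2B EtherType
    ethertype = struct.unpack_from("!H", frame, 12)[0]
    if ethertype != 0x0800:              # not IPv4
        return None
    ihl = (frame[14] & 0x0F) * 4         # IPv4 header length in bytes
    proto = frame[14 + 9]                # 6 = TCP, 17 = UDP
    if proto not in (6, 17):
        return None
    src_ip, dst_ip = struct.unpack_from("!4s4s", frame, 14 + 12)
    sport, dport = struct.unpack_from("!HH", frame, 14 + ihl)
    dotted = lambda raw: ".".join(str(b) for b in raw)
    return (dotted(src_ip), dotted(dst_ip), sport, dport, proto)
```

An adapter does this classification in silicon at line rate; the sketch only shows which header fields are involved.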
Another benefit is the enabling of massive parallel processing of data: hardware that provides this flow information can identify flows and distribute them across up to 32 server CPU cores, all while keeping CPU usage low. Appliance designers should consider features that preserve as much processing power and memory as possible, and should identify applications that require memory-intensive packet payload processing.
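The flow-distribution idea can be sketched as a deterministic hash over the 5-tuple; sorting the endpoints makes the hash symmetric, so both directions of a conversation land on the same core. This is an illustrative sketch, not Napatech's actual distribution algorithm:

```python
import zlib

def core_for_flow(five_tuple, n_cores=32):
    """Map a flow 5-tuple to one of n_cores CPU cores.
    Endpoints are sorted so that both directions of the same
    conversation always hash to the same core."""
    src, dst, sport, dport, proto = five_tuple
    a, b = sorted([(src, sport), (dst, dport)])
    key = f"{a}|{b}|{proto}".encode()
    return zlib.crc32(key) % n_cores     # CRC32 is stable across runs
```

Keeping each flow pinned to one core means per-flow state never needs cross-core locking, which is what makes the parallelism cheap.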
Improving Analysis Capabilities
Many products currently exist to tackle the problem of downstream analytics in a voluminous environment; however, the ability of these tools to perform real-time analysis and alerting is limited by their performance. Solutions used to extract, transform and load data into downstream systems tend to increase the latency between data collection and data analysis. Moreover, the volume and variety of data being ingested makes it nearly impossible for analysts and decision makers to locate the data they need across the various analysis platforms.
Third platform activities will accelerate when intelligence is pushed to the point of data ingress, improving real-time analysis capabilities. Best practices include:
Directing Data Downstream
By inspecting data the moment it enters the network, appliances can make data-flow decisions that direct data to downstream consumers at line rate. This minimizes the unnecessary flow of data through downstream brokers and processing engines.
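As a minimal illustration of ingress-time routing (the subscription table and consumer names below are hypothetical), each packet is matched against downstream subscriptions the moment it arrives, so traffic nobody has asked for never travels downstream:

```python
from collections import defaultdict

# Hypothetical subscriptions: destination port -> interested consumers
SUBSCRIPTIONS = {
    80:  ["web-analytics"],
    443: ["web-analytics", "security-monitor"],
    53:  ["dns-monitor"],
}

def dispatch_at_ingress(packets):
    """Route each packet to its interested consumers at ingress;
    unsubscribed traffic is dropped here instead of downstream."""
    queues = defaultdict(list)
    for pkt in packets:
        for consumer in SUBSCRIPTIONS.get(pkt["dport"], ()):
            queues[consumer].append(pkt)
    return queues
```

The decision logic runs once at the edge, rather than in every downstream broker.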
Analyzing Perishable Insights
Organizations must begin data analysis as soon as data is received. This enables them to make use of perishable insights—that is, data whose value declines rapidly over time. Doing so ensures that an organization can begin acting on what is happening immediately.
Providing Intelligent Alerts
Seeing what data is entering the system in real time, before it reaches decision-making tools, provides intelligent alerts to stakeholders, informing them of the presence of new data that is of interest for their area of responsibility.
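This alerting pattern can be sketched as a set of per-stakeholder interest predicates evaluated against each record as it is ingested; the stakeholder names and record fields here are invented for illustration:

```python
# Hypothetical stakeholder interests, expressed as predicates
INTERESTS = {
    "security-team": lambda r: r.get("event") == "auth_failure",
    "network-ops":   lambda r: r.get("latency_ms", 0) > 200,
}

def alerts_for(record):
    """Return the stakeholders who should be alerted about this record."""
    return sorted(who for who, wants in INTERESTS.items() if wants(record))
```

Because the predicates run at ingress, stakeholders hear about relevant data before it is buried in a downstream analysis platform.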
New Technologies for the Data Age
Network applications that can perform these functions come at a price, which makes scaling up a costly proposition for cloud providers, telcos, carriers and enterprises alike. Even worse, if the market shifts toward adoption of novel network hardware, these organizations must bear the cost of updating legacy infrastructure in order to stay competitive.
Appliance designers can now dissociate application data from network processing and build flexibility and scalability into the design. This gives them the ability to introduce a powerful, high-speed platform into the network that is capable of capturing data with zero packet loss at speeds up to 100 Gbps.
In addition to performance monitoring, the analysis stream provided by the hardware platform can support multiple applications. Multiple applications running on multiple cores can execute on the same physical server, with software ensuring that each application has access to the same data stream as it is captured. This transforms the performance monitor into a universal appliance for any application requiring a reliable packet-capture data stream, making it possible to incorporate more functions in the same physical server and increasing the value of the appliance.
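A rough sketch of that sharing model (the class name is invented for illustration): one capture stream is fanned out so every registered application sees every packet once:

```python
class CaptureFanout:
    """Deliver every captured packet to all registered applications,
    emulating several apps sharing one zero-loss capture stream."""

    def __init__(self):
        self._apps = []

    def register(self, callback):
        """Subscribe an application's packet handler."""
        self._apps.append(callback)

    def on_packet(self, pkt):
        """Hand the captured packet to every application."""
        for handler in self._apps:
            handler(pkt)
```

In a real appliance the "callbacks" would be separate applications pinned to dedicated cores reading from shared capture buffers, but the contract is the same: one capture, many consumers.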
Software acceleration platforms and tools are laying the groundwork for organizations to manage ever-increasing data loads without compromise. To enable a more robust network in today's fast-paced environments and stay ahead of the ever-expanding data growth curve, network management and security appliances will, now more than ever, need to stay in front of advancing network speeds. Only then can they ensure that apps run quickly, videos stream smoothly and end-user data remains secure and accessible on the third platform.
Daniel Joseph Barry is VP of Positioning and Chief Evangelist at Napatech and has over 20 years' experience in the IT and telecom industry. Prior to joining Napatech in 2009, Dan Joe was Marketing Director at TPACK, a leading supplier of transport chip solutions to the telecom sector. From 2001 to 2005, he was Director of Sales and Business Development at optical component vendor NKT Integration (now Ignis Photonyx), following various positions in product development, business development and product management at Ericsson. Dan Joe joined Ericsson in 1995 from a position in the R&D department of Jutland Telecom (now TDC). He has an MBA and a BSc in Electronic Engineering from Trinity College Dublin.