Swimming downstream? (An Application Performance Management Analogy)

Here in British Columbia, we love the analogy of a salmon swimming upstream. It’s particularly relevant when you’re a small and driven team, out to convince the world that the network is an ideal source of data for application performance management (APM), security, and business analytics.

That’s why we’re delighted that a big player like Splunk has now also realized the value of APM and network data. Coming on the heels of ExtraHop’s $41M financing earlier this summer, I think we can safely conclude that transforming network data into intelligence for ops, security, and business users (or wire data analysis) is about to receive a lot more interest and attention.

It’s time to swim downstream for a while.

Except that stream is a bit of a misnomer. What’s really of interest on the network is a packet, not a stream. Put enough packets together and you get a message: a request for service, or a response from a service. Look inside those messages and things get really interesting; that’s where you find purchase amounts, customer IDs, location information, and a whole bunch of other valuable data. Put these messages together and you get a transaction: a record of what a user was trying to achieve and how it went.
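To make the packet-to-transaction idea concrete, here is a minimal illustrative sketch (not INETCO’s actual implementation; all names are hypothetical) showing packets reassembled in sequence order into a message, and a request/response pair grouped into a transaction record:

```python
# Illustrative only: reassemble captured packets into application messages,
# then pair a request message with its response to form a transaction.
from dataclasses import dataclass

@dataclass
class Packet:
    flow: str        # e.g. "client->server"
    seq: int         # order within the flow
    payload: bytes

@dataclass
class Transaction:
    request: bytes
    response: bytes

def reassemble(packets):
    """Concatenate a flow's packets in sequence order into one message."""
    ordered = sorted(packets, key=lambda p: p.seq)
    return b"".join(p.payload for p in ordered)

# Two packets each carrying half of a purchase request, plus a response.
req_packets = [
    Packet("client->server", 2, b'"amount": 49.99}'),
    Packet("client->server", 1, b'{"customer_id": "C123", '),
]
resp_packets = [Packet("server->client", 1, b'{"status": "approved"}')]

txn = Transaction(request=reassemble(req_packets),
                  response=reassemble(resp_packets))
print(txn.request.decode())   # full message: customer ID and amount visible
print(txn.response.decode())  # how the request went
```

Looking inside the reassembled message is where the customer ID and purchase amount become visible; pairing it with the response tells you how the transaction went.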

Look across previous blog posts, case studies, and press releases on our website and you’ll see a raft of examples of how customers are able to take advantage of highly processed network data from our INETCO Insight product.

  • A leading bank uses real-time network data to adjust promotional activities and user experience at its ATMs.
  • A large card processor uses real-time network data to spot transaction problems before its customers do.
  • A credit union uses real-time network data to decide where to place ATMs based on the types of customers who use them.

The Splunk App for Stream (naming issues aside) fills a big gap in Splunk’s data acquisition capabilities. Until now, they had no way to pull in detailed network data (what they call “wire data”), which is being recognized as an increasingly valuable source of intelligence. Of course, this detailed network data is the ultimate in Big Data, so incorporating it into Splunk will drive up indexing volumes significantly, and indexed volume is how Splunk makes money.

In fact, there is a problematic trade-off Splunk customers will have to make between data quality and solution cost when using Stream capabilities. If they choose to index everything to build a valuable base of information for analytics, their Splunk license costs will grow astronomically. To help manage these costs, Splunk customers can index selectively by using the event filtering capabilities in Stream—though then they will miss all sorts of interesting information they just didn’t think to capture.

What happens when a business user needs some data for analysis that wasn’t captured? He or she has to make an IT service request, get a technical staff member to reconfigure the Stream filters, and then wait until enough data builds up again. This is exactly counter to the self-serve approach to business analytics most companies are striving for and exactly like the legacy approach to business intelligence that most are rebelling against.

That’s why we designed INETCO Insight to capture everything, all the time, in real time. This way, our customers can instantly answer any question that might come to mind about their security stance, application behavior, or business performance. They want to spot hundred-dollar problems before they become million-dollar problems; waiting for a scheduled indexing process to run before data is available is unacceptable. We also price based on the complexity of the environment we’re monitoring, not the amount of data the customer wants to collect and retain. Our customers don’t have to sacrifice data quality in order to manage costs.

Moving out of the realm of architecture and pricing, there are a few other critical differences between Stream and INETCO Insight:

  1. INETCO Insight has a powerful correlation engine and Unified Transaction Model to reconstruct end-to-end transactions. This is a critical requirement for application performance management (APM).
  2. INETCO Insight automatically decodes hundreds of application protocols. With Stream, technical users must manually configure decoding of each application protocol. This makes for a much more fragile, harder-to-maintain system. If the Stream expert leaves the organization, a company’s entire business analytics infrastructure is at risk.
  3. INETCO Insight has a rich security model for handling all the sensitive information you find on enterprise networks and ensuring companies are compliant with PCI and other privacy standards. Stream customers will have to build this model themselves in Splunk.

All of this attention on network data analytics comes at a perfect time for INETCO. While players like Splunk are focused on the blocking-and-tackling basics of network data capture, we’ve nailed these problems and are moving on to more advanced use cases in application performance, security, and business analytics. To see all of this in action, check out our recorded webinar on Unlocking Your ATM Big Data, or request a personal demo.