Over the last few years there’s been no shortage of talk about Big Data. Organizations across all industries are using it to improve their business intelligence, their cyber security, and their application performance monitoring. A McKinsey report found that retailers fully leveraging Big Data could increase their operating margins by 60%. Given numbers like these, it’s no surprise that financial institutions are looking for ways to mine and manage their Big Data, and there’s no shortage of IT management systems for them to choose from.
They can use the cross-vertical Big Data solutions created by IBM and HP (Tivoli Netcool and Operations Manager, respectively). Newer firms such as Splunk, Loggly, and Sumo Logic offer slightly different flavours of Big Data management. And for those focused on managing Big Data from ATMs, there’s NCR’s Aptra Vision solution.
So while banks and other financial institutions have plenty of choices when it comes to systems that help them make sense of Big Data, ultimately these solutions are only as good as the data they have access to; and that data, most of the time, arrives as logs. Unfortunately, logs have limitations…
Find out more in this article from Financial IT.