Big data may have permeated the enterprise, but the space where human judgment meets data still needs work. The recent Target security breach and the GM court case demonstrate the extreme effects of ignoring data, or of not paying attention to the right data. Target's IT staff chose to ignore the breach warnings issued by the company's security system. GM's own data analytics systems surfaced the ignition problems, but presented them as such a small share of the company's overall complaints that they were deemed inconsequential.
Big data technology has evolved to the point where most or all pertinent data can be collected and stored. Putting that data into action is where the problems emerge. Beyond users overlooking or undervaluing important data, the back-end data feeds themselves may be the wrong ones for the job. Google Flu Trends, which proved notoriously inaccurate in tracking annual flu epidemics, is one example of a system built on bad feeds.
The next evolution of data analytics is to balance the big data with the bad data: find the right data and execute on it properly. The first step is building a smart layer on top of data collection to better find and curate data, the way Metric Insights has with the KPI Warehouse. The second is to make data actionable by setting up alerts around data quality. When the data being pulled looks abnormal, it makes sense to alert users, with context, so that they can address the root problem. For example, send an alert when the volume of incoming data changes significantly from what is usually captured.
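A volume check like the one described above can be as simple as comparing today's record count against a recent baseline. The sketch below is illustrative, not any vendor's implementation; the function name, the window of daily counts, and the three-sigma threshold are all assumptions chosen for the example.

```python
# Minimal sketch of a data-volume alert: flag a load whose record count
# deviates sharply from the recent baseline. The z-score threshold and
# the shape of the history are illustrative assumptions.
from statistics import mean, stdev

def volume_alert(history, current, z_threshold=3.0):
    """Return an alert message if `current` deviates more than
    `z_threshold` standard deviations from the mean of `history`,
    else None."""
    if len(history) < 2:
        return None  # not enough baseline to judge
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # Perfectly flat history: any change at all is worth a look.
        if current != mu:
            return f"Volume changed from a flat baseline of {mu:.0f} to {current}"
        return None
    z = (current - mu) / sigma
    if abs(z) > z_threshold:
        direction = "spike" if z > 0 else "drop"
        return (f"Volume {direction}: {current} records vs. baseline "
                f"{mu:.0f} +/- {sigma:.0f} (z = {z:.1f})")
    return None
```

The message carries the baseline and the z-score so the recipient gets the context, not just a red flag; tuning the threshold per feed is what keeps such alerts from becoming noise users learn to ignore.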
Bridging the gap between data collection and human intelligence is the final proving ground for data analytics systems. By combining automated alerts with personalized digests, among other features, we can help users work effectively with data and narrow the divide between people and analytics.