Since the dawn of enterprise digital computing, managers have been looking to computer operators to run data processing jobs on those original workhorses of computation. The goal: to generate reports that the managers would use to make decisions on how to run their companies or government agencies.
Today, those large binders filled with perforated pages striped in glorious green and white may be long gone, but the batch job/paper report/human decision pattern remains ingrained in our managerial consciousness, for better or worse.
Cut to half a century later. We now have big data. Cloud computing. A wholly revamped notion of digital that is mobile-centric, omnichannel, and customer-focused. And yet, old habits are slow to die.
Take, for example, the first version of Hadoop. It was a batch job processing platform that would generate reports (paper optional, thankfully) that managers might attempt to interpret in order to make better decisions.
Yes, today the hardware is faster, the software is better, the data sets bigger – but the human context for data analytics in many ways remains ensconced in the Mad Men days.
Fortunately, this picture is changing. Real-time analytics technology is now a reality. Analysis of data about the past is still important, but insight into what’s happening right now is increasingly available to the decision maker, and it is up to that decision maker to understand how to leverage such insight to make good decisions – right now.
The True Value of Real-Time Analytics
The reason it has taken so long for real-time analytics to become a useful tool is the bottleneck problem: every step in the data lifecycle must perform in real time, or the end result falls short.
Today all the pieces are finally falling into place. From data collection to movement to aggregation to processing to analysis to visualization, vendors and open source communities are stepping up to the plate and solving each bottleneck on this path to insight.
The end result – when everything works properly – can be astounding. A simple example: imagine moving the “buy” button to different positions on a live ecommerce web page while watching real-time purchasing data go up and down until you find the optimal location.
Now do the same thing for every other aspect of your business. That’s the power of real-time analytics.
Sounds good, right? But if that example leaves you scratching your head, you’re not alone. Only in certain still-rare circumstances are the benefits of real-time analytics so cut-and-dried. It’s one thing to shift a button around, but quite another to deal with the broad complexities of an enterprise business environment.
Perhaps the greatest challenge in the simplistic ecommerce example above is establishing the feedback loop. The traditional batch job/report/decision pattern ends with a human decision, not with an automated decision leading to an action that changes the data in real time. That part of the real-time analytics pattern is still uncomfortably unfamiliar.
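To make that unfamiliar part concrete, here is a minimal sketch of what an automated feedback loop for the buy-button example might look like. It is written in Python with hypothetical position names and simulated visitor traffic standing in for a live ecommerce integration, and it uses an epsilon-greedy bandit, one common way (certainly not the only one) to let real-time conversion data drive the decision automatically:

```python
import random

# Candidate "buy" button positions (hypothetical page slots).
POSITIONS = ["top-right", "mid-page", "bottom-sticky"]
EPSILON = 0.1  # fraction of traffic reserved for exploration

# Running conversion stats per position: [purchases, impressions]
stats = {pos: [0, 0] for pos in POSITIONS}

def choose_position():
    """Mostly exploit the best-converting position; occasionally explore."""
    if random.random() < EPSILON:
        return random.choice(POSITIONS)
    # Exploit: highest observed conversion rate, with a small smoothing
    # prior so untested positions aren't dismissed out of hand.
    return max(POSITIONS, key=lambda p: (stats[p][0] + 1) / (stats[p][1] + 2))

def record_outcome(position, purchased):
    """Feed the live purchasing signal straight back into the decision."""
    stats[position][1] += 1
    if purchased:
        stats[position][0] += 1

# Simulated visitor stream standing in for real site traffic; the
# "true" conversion rates below are invented for the demo.
TRUE_RATES = {"top-right": 0.03, "mid-page": 0.05, "bottom-sticky": 0.04}
for _ in range(100_000):
    pos = choose_position()
    record_outcome(pos, random.random() < TRUE_RATES[pos])

for pos in POSITIONS:
    purchases, impressions = stats[pos]
    print(f"{pos}: {impressions} impressions, "
          f"{purchases / impressions:.3%} conversion")
```

The loop never waits for a human: each visitor’s outcome immediately updates the statistics that pick the button position for the next visitor.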
From Real-Time Analytics to Streaming Analytics
On the one hand, the importance of a feedback loop indicates how we must shift our thinking away from the traditional batch job/report/decision pattern to the real-time pattern in order to maximize the value we can squeeze from our analytics.
On the other hand, the feedback examples above reveal one remaining bottleneck: the human in the loop. As long as the goal of big data analytics is to fuel human decision making, all-too-human limitations will remain our limiting factor.
Once we establish a fully automated feedback loop, however, we take this final bottleneck out of the equation, unleashing a new level of speed and performance.
We’re not just talking real-time analytics at that point. We’ve moved to streaming analytics.
The field of streaming analytics takes as its starting point not simply large data sets, but a never-ending, 24/7 fire hose of data. All of a sudden, making sure we have no bottlenecks simply becomes the price of admission.
We can’t even begin to deal with streaming data if we have to pause for even a moment to move it, store it, process it, or analyze it. The fire hose just keeps on streaming.
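A minimal Python sketch of that incremental mindset, using a simulated event stream as a stand-in for the fire hose: aggregates are updated event by event as data flies past, and nothing ever pauses to accumulate a batch or land on disk:

```python
import random
from collections import deque

def event_stream():
    """Stand-in for the fire hose: an unbounded stream of order values."""
    while True:
        yield random.gauss(50.0, 15.0)

# A sliding window (here: the last 1,000 events) plus all-time running
# aggregates; the per-event bookkeeping is O(1), and the window average
# is only recomputed at the periodic peeks below.
window = deque(maxlen=1_000)
count, total = 0, 0.0

for i, value in enumerate(event_stream()):
    window.append(value)
    count += 1
    total += value
    if i % 10_000 == 0:  # periodic peek; the stream itself never pauses
        print(f"event {i}: window avg {sum(window) / len(window):.2f}, "
              f"all-time avg {total / count:.2f}")
    if i >= 50_000:  # cap the demo; a real stream has no such break
        break
```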
Today’s technology is finally piecing together the end-to-end components that make streaming analytics a reality. And yet, making sense of the results of our streaming analytics – in real time, without introducing a bottleneck – presents perhaps the greatest challenge of this approach.
Rising to this challenge is the field of cognitive computing – a way of analyzing streaming data that are multistructured, ambiguous, and in a constant state of flux.
The advantages are profound. Fraud detection and prevention, dynamic product pricing, Internet-of-Things (IoT) data analysis, electronic trading, customer promotion triggering, and compliance monitoring are some of the early examples of the power of streaming analytics – bolstered by cognitive computing to establish real-time, machine-learning-based feedback loops that drive business value with no bottlenecks.
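As one illustrative sliver of the fraud detection case (a sketch under simplifying assumptions, not a production detector), here is an online anomaly scorer in Python. It maintains running statistics over a simulated transaction stream using Welford’s algorithm and flags outliers the moment they arrive, while every event also updates the model, so the loop keeps learning continuously:

```python
import math
import random

class OnlineAnomalyScorer:
    """Welford's online mean/variance; flags transactions far from the norm."""

    def __init__(self, z_threshold=4.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations
        self.z_threshold = z_threshold

    def score_and_update(self, amount):
        """Return True if anomalous; always fold the event into the model."""
        anomalous = False
        if self.n > 30:  # wait for a minimal baseline before flagging
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(amount - self.mean) / std > self.z_threshold:
                anomalous = True
        # Welford update: the model learns from every event, flagged or not.
        self.n += 1
        delta = amount - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (amount - self.mean)
        return anomalous

scorer = OnlineAnomalyScorer()
for i in range(100_000):
    amount = random.gauss(80.0, 20.0)
    if random.random() < 0.001:  # inject the occasional wild transaction
        amount *= 25
    if scorer.score_and_update(amount):
        print(f"txn {i}: flagged amount {amount:.2f}")
```

A real detector would be far more sophisticated, and would likely quarantine flagged events rather than fold them into the baseline, but the shape of the loop is the point: score, act, and learn on every event, with no batch and no human in the path.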
And you ain’t seen nothin’ yet.
Putting the Human Back into the Streaming Analytics Picture
Most people’s exposure to cognitive computing consists of watching IBM Watson trounce the world’s best players at Jeopardy! The same kind of technology is the key to fully automated streaming analytics – technology that gets smarter and smarter by itself.
If the notion of an intelligent, constantly learning, Jeopardy!-playing computer running the largest companies and governments in the world doesn’t frighten you just a bit, then you probably never watched the Terminator movies, or any number of other dystopian AI-run-wild flicks.
Well, you can relax – at least for now. Establishing fully automated streaming analytics feedback loops won’t take humans out of the picture. Rather, it raises the role humans play to a new level. Instead of Terminator, think Star Trek.
People, after all, must still build and manage the entire kit and caboodle. And for all its power and velocity, streaming analytics is still just a tool – a tool in human hands.
In order to play at this new level, however, we must learn new skills. Just as we’re struggling to move from batch job/report/decision thinking to real-time thinking, we must now take the next step: working with never-ending torrents of multistructured, dynamic data.
The Intellyx Take: Thinking at the Meta-Insights Level
In other words, we must raise the insights we obtain from streaming analytics to the meta level. The insights we gain aren’t simply about how best to run our business, since we’re automating how we use those insights. Instead, what we humans must gather are meta-insights: insights into how to leverage the ongoing insights the technology provides us.
Most people aren’t working at this meta-insight level. Our technology is advancing so quickly that we’re still struggling to move from batch thinking to real-time thinking. The result is increased turbulence, both in the business and technology domains. After all, change is ubiquitous and accelerating across the digital spectrum. It’s virtually impossible to keep up.
Dealing with change as the fundamental business constant, after all, is what the Agile Architecture Revolution is all about. As we remove the bottlenecks from our data processing and leverage the fire hose of real-time, streaming analytics, this revolution becomes an unavoidable business reality. Get ready for warp speed!
Intellyx advises companies on their digital transformation initiatives and helps vendors communicate their agility stories. As of the time of writing, none of the organizations mentioned in this article are Intellyx customers.