Getting Decisive on Real-time Analytics
By Scott Jarr, Co-founder and Chief Strategy Officer, VoltDB
We are enjoying a golden age of analytics. For the past 15 years, smart companies have systematically used analytics to advance the pace of business and serve their customers in ways never before thought possible.
But hang on. This nice, calm evolutionary path is about to "tip" into one of huge change and disruption.
The role analytics plays in helping companies make better decisions is, in fact, fundamental to understanding this impending tipping point. Before examining that relationship, it's necessary to first review the history of database development.
In the early days of databases, everything was stored in one relational database management system (RDBMS), and transactions and analytics (such as they were) were run from that one system. The demands of both analytics and transaction processing eventually forced a separation into multiple systems. Initially, these used the same RDBMS software: one instance for transaction processing and one for analytics. Simply stated, the stores were split.
Once analytics could be run without impacting transactions, every business user, business analyst, and product manager wanted reports faster and on more recent data, which drove data warehouse development. Then MPP data warehouses were built for even faster reports. Indeed, it seems that the desire for faster, more current reporting has become insatiable.
Each subsequent iteration of data warehouse technology brought incremental improvements, including faster report runtimes on more recent data. This natural, smooth evolution has continued to where we are today: the availability of real-time analytics. Some systems are now fast enough to collect data as it enters the enterprise and give users visibility into that data almost instantly. Although some argue about how close to "time zero" one must get to be called "real time," that is a mere detail, irrelevant to the larger impact.
As long as we think strictly about the natural progression of analytics getting faster, we see only steady evolution. The tipping point becomes readily apparent when we look at two facts of today's landscape: (1) new data sources and (2) how we use the reports that are now run faster.
To the first point, data is undeniably arriving faster and from more highly automated sources than ever before. As a result, our systems are pushed to ingest data at high velocity, to make more decisions faster, and to present analytics in real time. These data sources represent the catalyst that is changing what we consider "valuable information" within our organizations. Whether they are user click streams, financial trades, personalized interactions, mobile records, or sensor readings, organizations of all types are now ingesting data and striving to eke out the most value from it -- because if they don't, their competitors surely will.
Which brings us to the second point. Reports are run for a single purpose: to make better decisions. Today, a decision based on a report is made because a human being examined the report and chose a course of action. When reports are generated from the previous week's data, the time it takes a human to digest the report and reach a decision is inconsequential to the overall process.
Let's analyze that process. A report is run; a human reads the report, evaluates the data, makes a decision, and then acts on it. From the time the report gets into someone's hands, minutes elapse before a decision is made.
To illustrate the point, just a few years ago a manager would receive a monthly report on inventory by individual item (or stock-keeping unit, aka SKU). The decision-maker would choose what to purchase and ship to particular warehouse locations, then enter those choices into a system to effect inventory control. The five minutes taken to make those decisions had an inconsequential impact on the process because the decision-maker was working from reports that represented month-long periods. The data was slow.
Now consider the reporting process in the face of new, fast, relentless data sources. The inventory example is no longer representative of the new age of online transaction processing (OLTP) problems. Today, digital marketing programs serve $20,000 worth of advertisements per minute. The five minutes the manager needed to make a human decision just cost the organization $100,000!
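The cost of that latency is simple arithmetic: value flows past the decision point at the data's rate for as long as a human deliberates. A minimal sketch using the article's illustrative figures (the function name is mine, not from any product):

```python
def latency_cost(rate_per_minute: float, decision_minutes: float) -> float:
    """Value that flows past an unmade decision while a human deliberates."""
    return rate_per_minute * decision_minutes

# The article's example: $20,000/minute of ad spend, five minutes to decide.
print(latency_cost(20_000, 5))  # 100000
```

Halve the decision time and the exposure halves; automate it to sub-second and the exposure all but disappears.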
Real-time analytics has provided visibility into these fast-moving events as they occur. More than anything, that visibility shines a bright light on the need to automate those decisions to the fullest extent possible. The evolution of analytics to real-time will lead to the necessity of businesses taking action on those analytics in a timeframe that doesn't penalize the process. Essentially, if the analytics occurs in real-time, the decisions must be made in real-time.
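To make "real-time decisions" concrete, here is a hedged sketch of what an automated decision can look like: a threshold rule evaluated against each incoming reading as it arrives, instead of against a weekly report. All names and figures (`should_pause`, the spend readings) are illustrative assumptions, not drawn from any particular system:

```python
SPEND_LIMIT_PER_MINUTE = 20_000  # illustrative budget threshold

def should_pause(spend_per_minute: float,
                 limit: float = SPEND_LIMIT_PER_MINUTE) -> bool:
    """Automated decision: pause a campaign the moment spend exceeds its limit."""
    return spend_per_minute > limit

# Simulated stream of per-minute spend readings (hypothetical data).
for spend in [4_000, 12_000, 19_500, 21_000]:
    if should_pause(spend):
        print(f"pause: ${spend}/min exceeds ${SPEND_LIMIT_PER_MINUTE}/min")
        break
```

The decision here takes microseconds rather than five minutes; in a production system the rule would run inside the ingest path itself rather than as a separate batch report.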
Analytics Is Not Enough
Clearly, not all corporate decisions can be made by an automated software process. But where sources are generating data at unprecedented velocity, combining real-time analytics with real-time decision-making is a competitive edge many organizations are striving for.
Real-time analytics alone is not the end game. We are experiencing the beginning of a major transformation in which analytics starts to become fundamentally connected to making better real-time decisions, and these two processes become indistinguishable from one another.
Organizations kicking off projects dealing with fast inbound data should anticipate being asked to make decisions (perform transactions) on that data as it arrives. It hasn't yet happened at every business, but there is no question real-time decision-making will eventually be added to the list of requirements.
In fact, this is such a strong trend that in the next 12 months we will see the end of "real-time analytics-only systems." There just isn't enough value unless you can use that system to run decisions as well.
Scott Jarr is co-founder and chief strategy officer at VoltDB, a company which provides a NewSQL in-memory relational database that offers high-velocity data ingestion, ACID compliance, high scalability, and real-time analytics. Scott has more than 20 years of experience building, launching, and growing technology companies from inception to market leadership in highly competitive environments. You can contact the author at email@example.com.