TDWI Checklist Report | Using Streaming Analytics for Continuous Operational Intelligence
May 30, 2014
According to TDWI's 2013 survey on managing big data, roughly half
of user organizations surveyed are already managing and leveraging
streaming data that’s generated frequently or continuously by
sensors, machines, geospatial devices, and Web servers.1 However,
most of these users today merely capture and store streaming data for
offline study. To mature, they need real-time practices and
technologies that let them analyze streaming data as it arrives and
then take immediate action for the highest business value.
For example, consider some of the use cases that the real-time,
continuous analysis of streaming data is making a reality today:
- Monitor and maintain the availability, performance, and capacity
of interconnected infrastructures such as utility grids, computer
networks, and manufacturing facilities
- Understand customer behavior as seen across multiple channels
so you can improve the customer experience as it’s happening
- Identify compliance and security breaches, then halt and correct
them immediately
- Spot and stop fraudulent activity even as fraud is being
perpetrated
- Evaluate sales performance in real time and meet quotas
through instant incentives such as discounts, bundles, free
shipping, and easy payment terms
Compelling use cases such as these typically result from a "perfect
storm" of desirable data types, software functions, and fast-paced
business processes:
Streaming data. The swelling swarm of sensors worldwide
(plus the extended “Internet of things”) produces large volumes of
streaming data that can be leveraged for business advantage. For
example, robots have been in use for years in manufacturing; now
they have additional sensors that can perform quality assurance, not
just assembly. For decades, mechanical gauges have been common
in many industries (chemicals, utilities); now the gauges are
replaced by digital sensors and “smart meters” to provide real-time
monitoring and analysis. GPS and RFID signals now emanate from
mobile devices and assets ranging from smart phones to trucks to
shipping pallets, so all can be tracked in real time and controlled
precisely.
Streaming analytics. The growing consensus is that analytics is
the most direct path to business value drawn from new forms of big
data, which includes streaming data. Existing analytic techniques—
based on mining, statistics, predictive algorithms, queries, scoring,
clustering, and so on—apply well to machine data once it’s
captured and stored. Fortunately, newer vendor tools are re-engineering
these techniques and creating new analytic methods so they can operate
on continuously streaming data as well as stored data.
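
To make this concrete, the following is a minimal sketch of one such
re-engineered technique: a scoring function applied to each record as it
arrives rather than to a stored batch. The score_risk() model, the
event_stream() feed, and the 0.25 threshold are hypothetical stand-ins,
not any particular vendor's API.

# Minimal sketch: applying a pre-built scoring function to each record
# as it arrives, instead of to a stored batch. All names and values here
# are illustrative assumptions.

def score_risk(record):
    # Hypothetical model: flag records whose reading drifts far from spec.
    return abs(record["reading"] - record["setpoint"]) / record["setpoint"]

def event_stream():
    # Stand-in for a live feed (message queue, socket, sensor gateway).
    yield {"device": "pump-7", "reading": 103.0, "setpoint": 100.0}
    yield {"device": "pump-7", "reading": 131.0, "setpoint": 100.0}

for record in event_stream():
    score = score_risk(record)
    if score > 0.25:  # act on the score immediately, event by event
        print(f"ALERT {record['device']}: deviation score {score:.2f}")
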
Continuous analytics. Most analytic operations are scheduled to
run on a 24-hour or longer cycle. Getting the most out of streaming
data, however, requires analytics that execute or update every few
seconds or milliseconds to process each event, message, record,
transaction, or log entry as it arrives in case the new data signals
a business event that requires immediate attention. In other words,
continuous analytics go hand-in-hand with streaming data. Imagine
the results of a query incrementally updated with each new event
without needing to rerun the query against all pertinent data.
Likewise, continuous analytics may rescore an analytic model,
recalculate a statistic, remap a cluster, and so on, but as efficient,
incremental updates rather than execution from scratch.
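
As one illustration of this incremental style, the sketch below
maintains a running mean and variance with Welford's online algorithm,
so each arriving value updates the statistic in constant time instead of
triggering a recalculation over all stored data. The RunningStats class
and the sample readings are illustrative assumptions.

# Minimal sketch of a continuously updated statistic: a running mean and
# variance maintained incrementally (Welford's algorithm), one event at a time.

class RunningStats:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the current mean

    def update(self, x):
        # O(1) work per arriving value; no rescan of stored data.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        # Population variance of everything seen so far.
        return self.m2 / self.n if self.n else 0.0

stats = RunningStats()
for reading in [98.6, 99.1, 101.4, 97.8]:  # stand-in for a live feed
    stats.update(reading)
    print(f"n={stats.n} mean={stats.mean:.2f} var={stats.variance:.2f}")
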
Complex event processing (CEP). Event processing technology
has been applied to streaming data for decades, and a recent
TDWI Best Practices Survey shows that more than 20 percent of
organizations surveyed are doing event processing today in their DW/
BI solutions.2 However, traditional event processing tends to be very
simple, monitoring only one stream of data at a time. The newer practice
of CEP can monitor multiple streams at once, correlating events across
those streams, correlating streaming data with data of other vintages,
and continuously analyzing the results.
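
A simplified sketch of this kind of correlation follows: two stand-in
streams (badge swipes and system logins) are merged in time order, and a
complex event is raised when a login has no matching badge swipe within
the preceding five minutes. The stream contents, field layout, and
window size are invented for illustration.

# Minimal CEP-style sketch: correlate two event streams on user ID within
# a time window, reacting to each event as it arrives.

import heapq

WINDOW = 300  # seconds: a login should follow a badge swipe within 5 minutes

# Two stand-in streams, as (timestamp, user) pairs; a real deployment would
# read these from message queues or an event-processing engine.
badge_swipes = [(100, "alice"), (110, "bob")]
logins = [(150, "alice"), (200, "carol"), (500, "bob")]

recent_swipes = {}  # user -> timestamp of that user's most recent badge swipe

# Merge the two streams into one time-ordered feed and react to each event.
merged = heapq.merge(((ts, "swipe", user) for ts, user in badge_swipes),
                     ((ts, "login", user) for ts, user in logins))
for ts, kind, user in merged:
    if kind == "swipe":
        recent_swipes[user] = ts
    else:  # a login: correlate it against the badge-swipe stream
        swipe_ts = recent_swipes.get(user)
        if swipe_ts is None or ts - swipe_ts > WINDOW:
            print(f"COMPLEX EVENT: login by {user} at t={ts} "
                  f"with no badge swipe in the preceding {WINDOW}s")
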
Operational intelligence. OI is a new form of business
analytics that delivers visibility and insight into business operations
and similar processes, as they are happening. This new class of
enterprise software includes all the capabilities discussed above,
but in a unified tool that empowers users to explore data streams,
understand business processes (as seen via data), model processes,
write rules for event-driven alerts and responses, and create full-blown
business monitoring and surveillance applications. When
these applications run and respond continuously in real time, you
have continuous operational intelligence.
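
The sketch below suggests what such user-written rules might look like
in practice: each rule pairs a condition with a response and is
evaluated against every arriving event. The rule definitions,
thresholds, and notify() stub are illustrative assumptions rather than
any specific product's interface.

# Minimal sketch of rule-driven, event-triggered alerting.

def notify(channel, message):
    print(f"[{channel}] {message}")  # stand-in for e-mail, SMS, or a dashboard

RULES = [
    {
        "name": "order backlog",
        "condition": lambda e: e["type"] == "queue_depth" and e["value"] > 500,
        "response": lambda e: notify("ops", f"Backlog at {e['value']} orders"),
    },
    {
        "name": "payment failure spike",
        "condition": lambda e: e["type"] == "payment_failures" and e["value"] > 20,
        "response": lambda e: notify("fraud", f"{e['value']} failures in the last minute"),
    },
]

def handle(event):
    # Evaluate every rule against each arriving event and fire its response.
    for rule in RULES:
        if rule["condition"](event):
            rule["response"](event)

# Stand-in for a continuous feed of operational events.
for event in [{"type": "queue_depth", "value": 620},
              {"type": "payment_failures", "value": 7}]:
    handle(event)
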
This TDWI Checklist Report examines the user best practices and
vendor tool functions for analyzing streaming data, with a focus
on those that enable new applications in continuous operational
intelligence.