CEO Perspective: How to Accelerate Decision Making
From in-memory computing to stream processing, machine learning, and AI -- the world of analytics is changing quickly. We spoke to Kelly Herrell, CEO of Hazelcast, to find out how these technologies will help enterprises make decisions more quickly.
- By James E. Powell
- December 6, 2019
To stay competitive, enterprises must make decisions faster than ever, with as much data as possible in hand. Kelly Herrell, CEO of Hazelcast, reveals the technologies he considers must-haves for faster, better-informed decisions.
Upside: What technology or methodology must be part of an enterprise's data strategy if it wants to be competitive today? Why?
Kelly Herrell: In-memory computing is the new, permanent layer in the IT stack. The data processing paradigm is changing, driven by digitization and the explosion of data. Insights, and the actions they inform, are now perishable: whether you are preventing fraud or analyzing data streams, the value depends on acting in the very moment the data occurs. Digitization increases opportunities for new applications, but it also puts enormous pressure on eliminating latency.
The informational value within stored data remains high, yet the new demands require processing at millisecond latencies -- orders of magnitude faster than traditional databases. In addition, low-latency processing of streaming data is a rapidly growing requirement, enabling enterprises to capture insights as they happen. In both cases, latency is the new downtime, but both can be addressed with an in-memory computing platform.
What one emerging technology are you most excited about and think has the greatest potential? What's so special about this technology?
The new generation of stream processing is a game-changer. The previous era was "batch streaming" where a batch job was slowly loaded with data, after which it was pushed through a stream processing engine to derive insights. The evolution was "big batch" (Hadoop), then "micro-batch" (Spark); regardless, all data in the stream was historical. As a result, a massive amount of time-sensitive insight was forgone and the potential business value was never realized.
The new generation of stream processing is continuous, not batch. This enables the processing of data "on the wire," not just after it has first been stored. This capability will change the way we think about the business potential of applications.
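The contrast between batch and continuous processing can be made concrete with a small sketch. This is illustrative Python, not Hazelcast's API: a generator consumes events one at a time and emits a sliding-window average immediately, instead of waiting for a batch to fill before any result is produced.

```python
from collections import deque

def continuous_average(events, window_size=3):
    """Process each event as it arrives, emitting an updated
    sliding-window average immediately -- "on the wire" -- rather
    than after a batch job has loaded and stored the data."""
    window = deque(maxlen=window_size)
    for value in events:
        window.append(value)
        yield sum(window) / len(window)

# Each reading produces an insight the moment it arrives:
readings = iter([10, 20, 30, 40])
averages = [round(avg, 1) for avg in continuous_average(readings)]
print(averages)  # [10.0, 15.0, 20.0, 30.0]
```

In a batch design, the first average would only appear after the whole dataset had landed in storage; here every event yields a fresh, current result.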
Importantly, ML and AI will be highly correlated with continuous stream processing.
What is the single biggest challenge enterprises face today? How do most enterprises respond (and is it working)?
More than ever, time is money. Capitalizing on that fact requires a decisive change in IT architecture. Some industries, such as financial services and large e-commerce organizations, live and die by the pursuit of reducing latency. Many of these enterprises have already fully embraced the era of in-memory computing and are deriving tremendous business value as a result.
Others are slower to adapt, unsure of the first step. The good news for them is that it doesn't require a "big bang" architectural change. Generally, an in-memory platform is inserted above the databases as a high-speed complement to the existing architecture.
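The "inserted above the databases" pattern is essentially a read-through cache. The sketch below is a deliberately simplified illustration (a plain dict stands in for the database, and all names are hypothetical, not Hazelcast's API): reads hit the in-memory layer first and fall back to the slower store only on a miss.

```python
class ReadThroughCache:
    """Illustrative in-memory layer placed above a slower backing
    store. On a miss, the value is fetched from the store and kept
    in memory so subsequent reads avoid the slow path entirely."""

    def __init__(self, backing_store):
        self.backing_store = backing_store  # stand-in for a database
        self.cache = {}                     # the in-memory layer

    def get(self, key):
        if key in self.cache:                # fast path: in-memory hit
            return self.cache[key]
        value = self.backing_store[key]      # slow path: database read
        self.cache[key] = value              # populate for next time
        return value

db = {"customer:42": {"name": "Ada", "tier": "gold"}}
layer = ReadThroughCache(db)
print(layer.get("customer:42"))  # first read goes to the store
print(layer.get("customer:42"))  # second read is served from memory
```

Because the existing database remains the system of record, this layer can be added incrementally, which is why no "big bang" architectural change is required.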
Is there a new technology in data and analytics that is creating more challenges than most people realize? How should enterprises adjust their approach to it?
Machine learning and artificial intelligence are still in their "Wild West" phase in terms of maturity. Make no mistake, they are real. However, the range of tools and approaches is still wide, and perspectives on the technology still vary.
The most solid first step is deciding which processing platform to leverage. If value from ML/AI is to be realized, it will be because the processing platform enabled it. The platform should have three core qualities: the ability to process multiple model types and languages, the ability to run models against both stored and streaming data, and the ability to do so at extremely low latency regardless of scale.
By selecting the right processing platform, enterprises can put all of their budgeted focus on defining and generating models quickly and iteratively.
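The second quality above, running one model against both stored and streaming data, can be sketched in a few lines. The "model" here is a hypothetical stand-in (a simple threshold rule rather than anything trained), but it shows the point: the same scoring function applies unchanged to data at rest and data in motion.

```python
def score(txn):
    """Stand-in fraud model: flag transactions above a threshold.
    A real deployment would invoke a trained model here instead."""
    return txn["amount"] > 1000

# Data at rest: historical transactions already in storage.
stored = [{"id": 1, "amount": 50}, {"id": 2, "amount": 5000}]

# Data in motion: transactions arriving as a stream.
def stream():
    yield {"id": 3, "amount": 2500}
    yield {"id": 4, "amount": 10}

# The identical model runs over both sources:
flagged_stored = [t["id"] for t in stored if score(t)]
flagged_stream = [t["id"] for t in stream() if score(t)]
print(flagged_stored, flagged_stream)  # [2] [3]
```

Keeping one scoring path for both sources is what lets teams iterate on models without maintaining separate batch and streaming implementations.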
What initiative is your organization spending the most time/resources on today? What internal projects are your enterprise focused on so that you benefit from your own data or business analytics?
Our IT stack is entirely cloud-based, so our focus is on integrating those elements to produce an end-to-end, contiguous, and real-time view of our business. Doing it right requires a "think big, start small" approach.
Where do you see analytics and data management headed in 2019 and beyond? What's just over the horizon that we haven't heard much about yet?
ML and AI will dominate in terms of value discovery from data and analytics. These enabling technologies are here now. The combinatorial value of being able to apply ML/AI-driven computation across both stored and streaming data -- especially as data generation from IoT and edge computing take hold -- will produce a massive, long-term growth cycle in the analytics arena.
Describe your product/solution and the problem it solves for enterprises.
We have the industry's only platform for extremely low-latency data processing, at any scale, for stored and streaming data. Our customers are the largest in the world, leveraging our platform for applications in fraud detection, payment processing, e-commerce, healthcare -- the list goes on. When time is money, customers choose Hazelcast.
James E. Powell is the editorial director of TDWI, overseeing research reports, the Business Intelligence Journal, and the Upside newsletter.