Executive Perspective: DataOps in Volatile Times

Chris Bergh describes himself as "CEO and Head Chef" of DataKitchen, a company whose DataOps platform enables analytics teams to iterate and innovate. In this interview, he shares his enthusiasm for DataOps, explaining what makes it so popular and what's driving its adoption, as well as how to get the most from AI.

Upside: Of all the predictions for 2020 that you read, which stood out to you?

Chris Bergh: DataOps, of course! We've had a front-row seat watching companies apply DataOps methods to data analytics. There has been no single idea that has been as effective as DataOps at improving efficiencies and encouraging creativity in analytics teams.

We recently worked with a company to reduce the time required to provision new analytics development environments from 20 weeks to one day. An automated orchestration capability such as this slashes cycle time and decreases the cost of innovation. When cycle time is dramatically improved, an organization can explore dozens of analytics ideas in the time it used to take to implement one. That translates into a competitive advantage relative to peers.
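
A minimal sketch of what "environments as code" can look like in practice is shown below, written in Python for illustration. The container image, mounted sample data, and environment names are assumptions made for the example; this is not a description of DataKitchen's actual tooling.

```python
# Sketch: provisioning a disposable analytics development environment as code.
# The image, sample-data mount, and port are illustrative placeholders.
import os
import subprocess

def provision_environment(name: str, port: int) -> None:
    """Spin up an isolated analytics sandbox in a container."""
    subprocess.run(
        [
            "docker", "run", "-d",
            "--name", name,
            "-p", f"{port}:8888",                               # notebook port
            "-v", f"{os.path.abspath('sample_data')}:/data:ro", # read-only test data
            "jupyter/scipy-notebook",                           # any pinned analytics image
        ],
        check=True,
    )

def teardown_environment(name: str) -> None:
    """Remove the sandbox once the work is merged or abandoned."""
    subprocess.run(["docker", "rm", "-f", name], check=True)

if __name__ == "__main__":
    provision_environment("analytics-dev-feature-x", 8890)
```

Because the environment is declared in a script, standing up a second sandbox for a new analytics idea becomes a one-line call rather than a weeks-long request.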

After Gartner added "DataOps" to its Hype Cycle for Data Management, we saw a surge in activity. Industry experts predicted that DataOps would go mainstream in 2020.

Have those predictions proved accurate so far?

DataOps interest remains strong, but many of the other predictions for 2020 are out the window. The recent market volatility has completely changed the landscape for business. Strategic plans that were approved late last year are being scrapped. Experience from previous recessions suggests that C-level executives will defer discretionary spending and focus resources on pragmatic initiatives that directly contribute to top-line growth and bottom-line efficiencies. DataOps impacts both, so many DataOps initiatives are being accelerated.

Why is that shift accelerating? What's driving it, and why the urgency?

Data professionals know that data-driven decision making is now more critical than ever. Data analytics can serve as a company's most valuable tool when evaluating proposals for cost-cutting or investment. Analytics could mean the difference between finding the right mix of strategic moves and falling behind. The speed at which data teams can respond to analytics requests has become a differentiator between market leaders and laggards. CDOs are responding to this pressure by investing in DataOps automation that streamlines the processes and workflows related to analytics development and operations.

You've told me that you think AI will continue to underwhelm us this year. Why?

AI and data science are all the rage, but there is a problem that no one discusses. Machine learning tools are evolving to make it faster and less costly to develop AI systems, yet deploying and maintaining these systems over time is getting exponentially more complex and expensive. An AI model itself is only a small fraction of what it takes to deploy and maintain a model successfully in a business application.

Data science teams are incurring enormous technical debt by deploying systems without the processes and tools to maintain, monitor, and update them. Further, poor quality data sources create unplanned work and cause errors that invalidate results. A couple of years ago, Gartner predicted that 85 percent of AI projects would not deliver for CIOs.
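
As one hedged illustration of what "processes and tools to monitor" deployed models can mean, the sketch below compares the rate of positive predictions in production against a training-time baseline and raises an alert when they diverge. The baseline, threshold, and alerting behavior are assumptions for the example, not a prescribed method.

```python
# Sketch: post-deployment monitoring that flags prediction drift.
# Baseline rate and drift threshold are illustrative assumptions.
def check_prediction_drift(recent_predictions,
                           baseline_positive_rate: float = 0.12,
                           max_drift: float = 0.05) -> None:
    """Raise if the live positive-prediction rate drifts too far from baseline."""
    if not recent_predictions:
        raise ValueError("no recent predictions to evaluate")
    live_rate = sum(recent_predictions) / len(recent_predictions)
    drift = abs(live_rate - baseline_positive_rate)
    if drift > max_drift:
        raise RuntimeError(
            f"prediction drift {drift:.3f} exceeds {max_drift}; "
            "the model may need retraining or the input data may have shifted"
        )

if __name__ == "__main__":
    try:
        # A recent batch of binary predictions from the deployed model
        check_prediction_drift([0, 0, 1, 0, 1, 1, 1, 0, 1, 1])  # 60% positive
    except RuntimeError as alert:
        print(f"ALERT: {alert}")  # in practice, notify the on-call data engineer
```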

The data industry needs to learn from other industries. When DataOps is applied in data analytics and AI, it will deliver the same game-changing improvements as agile and DevOps did for software development and as lean manufacturing did for countless factories in other industries.

How can an enterprise get the most benefit from AI?

Companies can improve their rate of success deploying AI and machine learning by forgetting for a moment about model creation and focusing instead on the processes and workflows related to model development, deployment, maintenance, and monitoring.

Enterprises must start by rebuilding the factory that creates and manages the models. This means eliminating errors, implementing continuous deployment, orchestrating operations, automating environment creation, and monitoring development and operations. When the model factory runs like a finely-tuned machine, the AI and models it produces will deploy seamlessly and will have a higher probability of meeting business objectives. With reduced cycle time, data teams will be able to iterate more quickly toward better-performing models or rapidly adjust their models to changing environments.
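
A hedged sketch of one piece of that factory is shown below: an automated gate that validates a candidate model before it is promoted. The specific checks, thresholds, and model attributes are hypothetical stand-ins for whatever tests a real pipeline would run.

```python
# Sketch: an automated promotion gate in a "model factory."
# The minimum-rows and accuracy thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CandidateModel:
    name: str
    holdout_accuracy: float
    training_rows: int

def validate(model: CandidateModel) -> list:
    """Return the reasons, if any, that the model should not be deployed."""
    failures = []
    if model.training_rows < 10_000:
        failures.append("training set too small to trust")
    if model.holdout_accuracy < 0.80:
        failures.append(f"holdout accuracy {model.holdout_accuracy:.2f} is below the 0.80 gate")
    return failures

def promote(model: CandidateModel) -> None:
    """Promote the model only if every gate passes."""
    failures = validate(model)
    if failures:
        raise RuntimeError(f"{model.name} blocked: {failures}")
    print(f"{model.name} passed all gates; handing off to deployment")  # e.g., push to a model registry

if __name__ == "__main__":
    promote(CandidateModel("churn_model_v7", holdout_accuracy=0.86, training_rows=250_000))
```

The point is that promotion becomes an automated, repeatable decision rather than a manual handoff, so the same gates run for every candidate model the factory produces.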

In such times of upheaval, where customer behavior is often unlike anything in the recent past, does DataOps play any role in helping us better collect or analyze unpredictable data?

Like lean manufacturing, DataOps relentlessly focuses on eliminating waste. The outcome of this deliberate process is extremely fast cycle time, rapid and automated deployment, and the virtual elimination of errors. DataOps applies filters and controls to data at each stage of processing so errors are trapped and remediated, eliminating a major source of unplanned downtime.
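
The sketch below illustrates that idea of stage-level filters and controls: a test that runs between pipeline steps, passes clean rows downstream, and quarantines bad rows along with the reason they failed. The column names and rules are assumptions made for the example.

```python
# Sketch: a data test between pipeline stages that quarantines bad rows.
# Column names and validation rules are illustrative assumptions.
import pandas as pd

def screen_orders(df: pd.DataFrame):
    """Split a batch into clean rows and quarantined rows with a reason."""
    bad_amount = df["amount"] <= 0
    bad_date = df["order_date"].isna()
    bad = bad_amount | bad_date
    quarantined = df[bad].copy()
    quarantined["reason"] = bad_amount[bad].map(
        {True: "non-positive amount", False: "missing order_date"}
    )
    return df[~bad], quarantined

if __name__ == "__main__":
    orders = pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [120.0, -5.0, 80.0],
        "order_date": pd.to_datetime(["2020-05-01", "2020-05-02", None]),
    })
    clean, quarantine = screen_orders(orders)
    print(f"{len(clean)} clean rows passed downstream, {len(quarantine)} quarantined")
```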

When data organizations can respond quickly and robustly to fast-paced market conditions, they are much more likely to produce game-changing insights that achieve the enterprise's objectives. They can iterate many times on a problem while their slower competitors labor to get started. Data organizations that are bogged down with inefficient manual processes and errors and suffer from a lack of transparency into their operations won't be able to keep up with rapidly evolving market conditions. We are seeing a lot of that recently as markets undergo unprecedented volatility.

Mid-year is a good time to evaluate past predictions and look ahead to the next six months. Based on what you've seen so far this year, what do you predict for the rest of 2020?

The coming months are going to test the capabilities of data organizations. Markets will change unpredictably. Executives will want fast responses to their analytics requests. As organizations face pressures, tempers are going to get short. Leaders of data organizations must prepare for this critical phase by slashing cycle time.

There are several drivers of lengthy cycle time, including errors, unplanned work, and manual processes. In a recent report, IT research firm Gartner wrote that data teams are spending an average of 56 percent of their time manually executing and maintaining the data operations pipeline. This is a travesty.

When data-driven decision making can make all the difference, data engineers, analysts, and scientists are an enterprise's most precious resource. Organizations need them focused on creating new analytics that will guide the enterprise through turbulent times. These dynamics will shift focus away from the "latest shiny tool" and toward reducing cost and producing better outcomes in the way that data teams develop, deploy, test, monitor, collaborate, and measure their analytics operations. DataOps is a highly effective method for pursuing these objectives.

About the Author

James E. Powell is the editorial director of TDWI publications, including research reports, the Business Intelligence Journal, and the Upside newsletter. You can contact him via email.

