Why Enterprises Are Turning to DataOps
Why is DataOps so important for your enterprise data management tasks? We asked Chris Cook, CEO of Delphix and a veteran technology executive with more than 30 years of experience in the enterprise software industry, for his perspective.
- By James E. Powell
- November 1, 2019
Upside: What technology or methodology must be part of an enterprise’s data strategy if it wants to be competitive today? Why?
Chris Cook: Data management is as critical for companies today as software development. Businesses in the digital era must harness their data to power competitive advantages, inform decisions, drive new application features, and improve customer experiences. To do this, organizations are turning to DataOps to get the right data to the right place without sacrificing security or user privacy.
Just as DevOps arose to help companies ship high-quality software faster, DataOps has emerged to help companies solve the end-to-end delivery of data. DataOps aligns people, processes, and technology around the flow of data in the enterprise -- remedying the slow, stale, risky, and low-quality data that creates massive bottlenecks in software delivery and stymies business innovation.
With modern DataOps technologies, data delivery can be automated, made more secure, and managed (self-service style) in minutes -- bridging the gap between the need to safeguard data and maintain regulatory compliance and the need to use it to meet the ever-growing demands of the business.
At its core, DataOps is about overcoming the cost, complexity, and risk of managing data to meet the needs of modern business in the new, data-driven world.
What one emerging technology are you most excited about and think has the greatest potential? What’s so special about this technology?
The next few years will reshape the face of computing through IoT devices, blockchain, augmented reality, voice computing, and more, but no technology is more promising (or daunting) than artificial intelligence and machine learning.
Today, tech giants use real-time data to improve consumer experiences and drive industrywide change. Facebook uses AI to determine what we see in our feeds. Apple uses it to power iPhone features such as Face ID and Siri. Amazon serves recommendations based on our browsing history to improve our shopping experience.
These types of applications require access to clean, secure, and representative data sets for training and execution. Yet for most organizations, too much important data remains locked inside disparate enterprise systems, unable to be utilized. They don’t know how much data they have, where it lives, what to do with it, or how to secure it.
The companies poised to capitalize on AI innovation will be those that can manage the upcoming explosion of data scattered across multiple environments and feed their machine learning algorithms with fast, high-quality data.
What is the single biggest challenge enterprises face today? How do most enterprises respond (and is it working)?
For most businesses today, data is both their most important asset and the biggest challenge. It is the key to unlocking new insights, technologies, and operational efficiencies.
Yet despite its importance, many companies continue to struggle with the basics of data. Most haven't yet mastered data plumbing -- the way data and services are delivered. Even in Fortune 500 companies, we often find that it takes several months to move data into the right environments so it can be used. They must also consider the security, privacy, and regulatory compliance elements of data.
In many cases, these companies have no one clearly responsible for data. To solve this, organizations must look to DataOps, assembling data teams that are nimble and self-sufficient and can measure data plumbing metrics, set objectives, and invest in modern processes and technologies.
Is there a new technology in data and analytics that is creating more challenges than most people realize? How should enterprises adjust their approach to it?
As data continues to grow in volume and complexity, it's also becoming increasingly distributed across disparate environments. From physical data centers to managed service providers, SaaS vendors, and multiple public clouds, most organizations have a patchwork of data sources that support the business. This new hybrid, multicloud reality -- and how enterprises will manage it -- is a huge challenge for organizations seeking to reduce cost, scale their business, and improve performance.
Most digital apps and services can’t afford downtime, so enterprises must figure out how to sync and migrate live data across environments efficiently (or even in near real-time). They need technology capable of spanning physical data centers, co-location facilities, and all major public clouds. DataOps technologies solve these issues. The right technology can dramatically improve an organization’s ability to deliver fresh, secure data from anywhere and from any data source, on demand.
Where do you see analytics and data management headed in 2020 and beyond? What’s just over the horizon that we haven’t heard much about yet?
Whether they know it or not, every company today is a data company. More and more organizations are paying attention to data and their data practices as emerging disciplines such as DataOps continue to gain steam.
Data is the lifeblood of software development. Yet most companies, developers, and analysts don't have access to complete, representative, or up-to-date data. Instead, they make do with stale, partial, or synthetic data that leads to mistakes, errors, and false positives and negatives. That's going to change.
You're going to start to see DataOps technologies that provide ultra-lightweight, portable data environments designed to give developers self-managed, automated, and secure data access in minutes. Developers are the kings and queens of business behind today's digital innovations, and the future of data management will be about empowering development teams with everything they need to make informed decisions and build better products faster and more efficiently.
Tell us about your product/solution and the problem it solves for enterprises.
Delphix empowers businesses to accelerate innovation through data. The Delphix Dynamic Data Platform enables the free flow of data required to create products and make decisions faster and more efficiently while simultaneously managing data risk and security. Personalized DataPods -- automated, secure, and self-service data environments -- provide a comprehensive approach to DataOps that enables companies to easily deliver, manage, and secure data wherever it exists. Delphix grants organizations faster data access to fuel their most important digital initiatives without increasing risk, complexity, cost, or time.
James E. Powell is the editorial director of TDWI publications, including research reports, the Business Intelligence Journal, and the Upside newsletter.