Q&A: Solid Value Proposition a Key to MDM Success

MDM, data quality programs need value proposition to succeed

Failure to build a solid value proposition -- a clear statement of what you plan to achieve -- can undermine your master data management (MDM) or data quality initiative. In this interview, MDM expert David Loshin explains how -- and why -- to establish a good value proposition, stressing the importance of measuring performance before, during, and after implementation. “If you don’t understand the degree to which the business is negatively impacted by poor data management practices,” he says, “then you’ll be limited in the degree to which you can measure success.”

A consultant and thought leader in BI, data quality, and master data management, Loshin is president of Knowledge Integrity, Inc. He is the author of articles and books on data management, including the best-selling Master Data Management; his most recent book is The Practitioner’s Guide to Data Quality Improvement. Loshin is a frequent speaker at conferences and other events, and will discuss data governance and architecture at a TDWI Webinar on Feb. 24. He can be reached at [email protected].

BI This Week: What is the value proposition for data-oriented programs such as MDM or data quality?

David Loshin: That is the single most significant challenge in initiating a data-oriented program -- establishing a firm value proposition. Your question, in fact, is the one that I’m asked most frequently when I’m speaking at an event or Web seminar. By value proposition, I mean a clear statement that indicates exactly what benefits a program such as MDM or data quality will deliver to your company -- and at what cost and effort.

Some organizations have a level of maturity in which key business stakeholders recognize the general value of establishing good data management practices. In those organizations, that may be sufficient, so there’s no need for further effort to establish a value proposition. However, most companies have a less strategic bent, and in those cases, a more detailed argument for a new program has to be made. Certainly, we’ve heard people talk about “data as a corporate asset” and the need for “a single source of truth” for critical data domains. However nice these phrases sound, they aren’t value propositions and are insufficient for establishing a business case that can withstand scrutiny.

The challenge of clearly stating the value proposition is compounded by the circumstances under which it is typically developed: often, the people tasked with building a value proposition as part of a business case are technical staff members with little or no experience on the business side. To address this, our company’s approach is to draw a tie between information utility and the organization’s own core business value drivers.

We start out with a high-level framework focusing on four key areas of value: financial, risk, productivity, and trust. The next step is to look at the types of challenges the organization is facing within each of these high-level areas, and then consider how they are impacted by the absence of, or could be enhanced by, a data quality program or a unified view of master data. Iterative analysis can lead to identifying key measures that are directly related to opportunities for data-related improvements such as data quality and MDM.
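To make that iterative analysis concrete, here is a minimal sketch of one way to record its output: a simple map from the four value areas to candidate business measures tied to data issues. The specific metrics listed are hypothetical placeholders, not examples drawn from the interview.

```python
# One possible shape for the output of that analysis: each of the four
# value areas mapped to candidate business measures affected by data issues.
# The example metrics are hypothetical placeholders, not a prescribed list.
value_driver_map = {
    "financial":    ["revenue lost to duplicate customer records",
                     "marketing spend in regions with no sales potential"],
    "risk":         ["regulatory reports restated due to data errors"],
    "productivity": ["analyst hours spent reconciling conflicting data sets"],
    "trust":        ["forecast variance between planning and sales teams"],
}

for driver, metrics in value_driver_map.items():
    print(driver, "->", ", ".join(metrics))
```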

For example, one organization we worked with found that their business forecasting was affected because they had multiple data sets associated with leading indicator data. As a result, the business planners saw a different calculation of market share than the sales and marketing teams because of discrepancies between their underlying data sets. This was impacting the business in many ways. At one level, it led to spending from the marketing budget in regions that were not going to yield sales. At a very high level, it could have led to missed quarterly sales forecasts, thereby artificially depressing the company’s equity value in the stock market.

In summary, if the business is impacted by data discrepancies and information processing failures, our approach can be used to identify key value drivers and their corresponding relation to improvements in data management such as data quality management and MDM. I have provided a good introduction to this approach in the first and fifth chapters of my book The Practitioner’s Guide to Data Quality Improvement.

Given that, how can the business value of MDM -- or of a data quality program -- be measured?

Let’s take a business-value driver approach. In that case, we can look at any key business performance metric impacted by less-than-stellar data management techniques and speculate on the anticipated “lift” from improved data management.

Another example is customer attrition. One organization believes that when issues relating to customer identity management (such as seeing one customer’s charges incorrectly added to another customer’s statement) are discovered by the customer rather than the company itself, there is a greater likelihood that the customer will sever the business relationship. Often, these errors are attributable to identity-resolution failures. They might be either false positive matches, in which the data associated with two different customers is incorrectly combined into one record, or false negatives, in which more than one record exists for a single customer.
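To illustrate how those two failure modes arise, here is a minimal, hypothetical sketch of a naive matching rule. The records, fields, and rule are invented for illustration only; they are not Loshin’s identity-resolution method.

```python
# Hypothetical illustration of identity-resolution failures using a naive
# matching rule that compares a normalized name and a postal code.

def normalize(s: str) -> str:
    """Lowercase and strip punctuation/whitespace for a crude comparison."""
    return "".join(ch for ch in s.lower() if ch.isalnum())

def naive_match(rec_a: dict, rec_b: dict) -> bool:
    """Declare two records the same customer if name and postcode agree."""
    return (normalize(rec_a["name"]) == normalize(rec_b["name"])
            and rec_a["postcode"] == rec_b["postcode"])

# False positive: two different people share a name and postcode,
# so their charges would be merged onto one statement.
father = {"name": "John Smith", "postcode": "10001"}
son    = {"name": "John Smith", "postcode": "10001"}
print(naive_match(father, son))   # True, but they are different customers

# False negative: the same person recorded with a nickname and a typo,
# so two separate records (and statements) persist.
rec1 = {"name": "Katherine O'Neil", "postcode": "02139"}
rec2 = {"name": "Kathy ONeil",      "postcode": "02139"}
print(naive_match(rec1, rec2))    # False, but they are the same customer
```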

Measuring the number of times a customer is not retained due to an identity resolution failure would establish a baseline, allowing you to calculate an anticipated increase in retention when a master data management framework eliminates those identity resolution failures.
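As a rough illustration of that calculation, the sketch below turns a baseline attrition count into an anticipated annual lift. Every figure is an invented assumption, not a number from the interview.

```python
# Hypothetical back-of-the-envelope lift calculation; all numbers are invented.

customers_lost_per_year = 1200          # baseline: attrition tied to identity errors
avg_annual_revenue_per_customer = 850.0 # average revenue of a lost customer
expected_error_reduction = 0.60         # share of failures MDM is assumed to eliminate

baseline_cost = customers_lost_per_year * avg_annual_revenue_per_customer
anticipated_lift = baseline_cost * expected_error_reduction

print(f"Baseline cost of identity-driven attrition: ${baseline_cost:,.0f}/year")
print(f"Anticipated annual lift from MDM:           ${anticipated_lift:,.0f}/year")
```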

That sounds fairly straightforward. Why is it so difficult for companies to understand and to measure the value of their data programs?

I think it goes back to the same principle: groups often get limited buy-in for attacking a problem using speculative technologies, and they don’t validate that those technologies will fundamentally address the root cause of the problem. In one case, for example, the client wanted to deploy an MDM program because they wanted to engage in more cross-selling of their products. However, one reason they weren’t able to cross-sell effectively had more to do with a lack of training for their sales teams than with access to master data. Introducing MDM might help, but absent any significant increase in sales, the technology is first in line to be blamed.

Essentially, if you don’t understand the degree to which the business is negatively impacted by poor data management practices, then you’ll be limited in the degree to which you can measure success. That is why we advocate a practical approach that looks at correlating business opportunities to information utility. In fact, a business model built on quantifiable business measures provides three excellent artifacts: a baseline measure of the current state, defined success objectives, and most important, quantifiable metrics to continuously measure improvement.

When you talk to your clients, where are they in the process of implementing an MDM or data quality program and in measuring the program’s true value?

Interestingly, we are often not the first consulting company they’ve worked with on data management improvements. Organizations often have attempted to implement data quality or data governance before but have been stymied by what we call “failure to launch.” They start out with limited value propositions, grudging support from a select number of business partners, and little or no perception of the systemic issues associated with the people and processes that need to accompany technical improvement projects.

As a result, lately we have seen many clients refocusing on the initial stages of their programs. Those stages include assessing the business impacts of poor data quality, performing data quality assessments, deriving business-oriented data quality metrics, and establishing methods for reporting data quality measures. They also include engaging the business side to understand why additional investments in data quality or master data management can add value.

A number of these clients have made real progress, often as a result of producing these measures. In fact, some of them have started to present their results at various venues. Readers interested in hearing more can contact me directly at [email protected] or go to http://www.dataqualitybook.com, a Web site I set up to accompany my book on data quality.

What can you suggest as a good first step (or steps) toward a data quality or MDM program, including measuring its value?

It’s hard to name a single specific thing because there are lots of potential activities. One good first step, however, would be to take a first cut at documenting the data domains that are commonly used across the organization. That can range from high-level concepts such as customer and product, to lower-level notions such as reference data concepts and hierarchies (such as location or gender), or even conceptual domains that comprise the value concepts stored within your data sets.

As a second step, I might couple this with a semantic analysis to ensure that the data items that share names actually refer to the same real-world concepts. Third, identifying which business processes use those data domains will provide a starting point for soliciting data expectations from the appropriate set of business data consumers.
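A minimal sketch of what those first two steps might produce is shown below: a small catalog of shared data domains and a check that data items sharing a name actually carry the same meaning across systems. All names, systems, and entries are hypothetical.

```python
from collections import defaultdict

# Step 1 (sketch): a first-cut catalog of data domains used across the
# organization, from master data concepts down to reference data. Hypothetical.
domain_catalog = {
    "customer": {"level": "master data",    "used_by": ["sales", "billing", "support"]},
    "product":  {"level": "master data",    "used_by": ["catalog", "fulfillment"]},
    "location": {"level": "reference data", "used_by": ["shipping", "tax"]},
    "gender":   {"level": "reference data", "used_by": ["marketing"]},
}

# Step 2 (sketch): each system's data items, tagged with the real-world
# concept they denote, so same-named items can be compared semantically.
system_glossaries = {
    "crm":     {"customer_id": "person or org that buys", "region": "sales territory"},
    "billing": {"customer_id": "account holder who pays",  "region": "shipping destination"},
}

# Flag items that share a name but carry different meanings across systems.
meanings = defaultdict(set)
for system, glossary in system_glossaries.items():
    for item, concept in glossary.items():
        meanings[item].add(concept)

for item, concepts in meanings.items():
    if len(concepts) > 1:
        print(f"'{item}' is used with different meanings: {sorted(concepts)}")
```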
