

TDWI Blog: Data 360


The Three Core Activities of MDM (part 3)

Blog by Philip Russom
Research Director for Data Management, TDWI

I’ve just completed a TDWI Best Practices Report titled Next Generation Master Data Management. The goal is to help user organizations understand MDM lifecycle stages so they can better plan and manage them. TDWI will publish the 40-page report as a PDF on April 2, 2012, and anyone will be able to download it. In the meantime, I’ll provide some “sneak peeks” by blogging excerpts from the report. Here’s the third in a series of three excerpts. If you haven’t already, you should read the first excerpt and the second excerpt before continuing.

Technical Solutions for MDM
An implementation of MDM can be complex, because reference data, like most data sets, demands a lot of attention. MDM solutions resemble data integration (DI) solutions (and are regularly mistaken for them) in that MDM extracts reference data from source systems, transforms it to normalized models that comply with internal MDM standards, and aggregates it into a master database, where both technical and business people can profile it to reveal duplicates and noncompliant records. Depending on the architecture of an MDM solution, this database may also serve as an enterprise repository or system of record for so-called golden records and other persistent reference records. If the MDM solution supports a closed loop, records that are improved in the repository are synchronized back to the source systems from which they came. Reference data may also be output to downstream systems, such as data warehouses or marketing campaign systems.
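To make the extract-transform-aggregate-profile flow above concrete, here is a minimal sketch in Python. The source systems (a CRM and an ERP), field names, and matching key are all hypothetical illustrations, not any particular MDM tool’s API.

```python
# Sketch of the MDM flow described above: extract customer reference
# records from two hypothetical source systems, transform them to a
# shared internal model, aggregate them into a master list, and
# profile the result for duplicates.

def normalize(record, source):
    """Map a source-specific record onto the internal MDM model."""
    return {
        "source": source,
        "name": record.get("name", "").strip().title(),
        "email": record.get("email", "").strip().lower(),
    }

def build_master(sources):
    """Aggregate normalized records from all sources into one master list."""
    master = []
    for source, records in sources.items():
        master.extend(normalize(r, source) for r in records)
    return master

def find_duplicates(master):
    """Profile the master list: group records sharing the same match key."""
    groups = {}
    for rec in master:
        key = (rec["name"], rec["email"])
        groups.setdefault(key, []).append(rec)
    return {k: v for k, v in groups.items() if len(v) > 1}

# Two hypothetical source systems holding the same customer in
# slightly different formats.
crm = [{"name": "ada lovelace ", "email": "ADA@example.com"}]
erp = [{"name": "Ada Lovelace", "email": "ada@example.com"}]

dupes = find_duplicates(build_master({"CRM": crm, "ERP": erp}))
print(dupes)  # one group: the same customer appears in both systems
```

A real MDM solution would use far more sophisticated matching (fuzzy names, addresses, survivorship rules), but the pipeline shape is the same.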

MDM solutions also resemble data quality (DQ) solutions, in that many data quality functions are applied to reference data. For example, “customer” is the business entity most often represented in reference data. Customer data is notorious for data quality problems that demand remediation, and customer reference data is almost as problematic. We’ve already mentioned deduplication and standardization. Other data quality functions are also applied to customer reference data (and sometimes to other entities, too), including verification and data append. Luckily, most tools for MDM (and related disciplines such as data integration and data quality) can automate the detection and correction of anomalies in reference data. Developing this automation often entails creating and maintaining numerous “business rules,” which the software can apply automatically once deployed.
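The “business rules” idea above can be sketched as a set of small functions, each detecting or correcting one anomaly in a customer reference record, which the deployed software runs automatically. The rule names, fields, and checks here are hypothetical examples, not a specific vendor’s rule engine.

```python
# Sketch of automated business rules for customer reference data:
# each rule is a small function applied to every record.
import re

def standardize_phone(rec):
    """Standardization rule: keep digits only, so '555-010-1234'
    and '(555) 010 1234' compare as the same number."""
    rec["phone"] = re.sub(r"\D", "", rec.get("phone", ""))
    return rec

def verify_email(rec):
    """Verification rule: flag records whose email fails a basic
    shape check (something@something.tld)."""
    rec["email_valid"] = bool(
        re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", rec.get("email", ""))
    )
    return rec

# The deployed rule set; real tools let business users maintain this.
RULES = [standardize_phone, verify_email]

def apply_rules(record):
    """Run every deployed rule over the record, as an MDM tool would."""
    for rule in RULES:
        record = rule(record)
    return record

rec = apply_rules({"phone": "(555) 010-1234", "email": "pat@example.com"})
print(rec["phone"], rec["email_valid"])  # 5550101234 True
```

Keeping each rule as an independent function mirrors how MDM tools let rules be added, versioned, and retired without rewriting the pipeline.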


Keep an eye out for another MDM blog, coming March 16. I’ll tweet when that blog is posted.

Please attend the TDWI Webinar where I will present the findings of my TDWI report Next Generation MDM, on April 10, 2012, at noon ET. Register online for the Webinar.

Posted by Philip Russom, Ph.D. on March 2, 2012

