CASE STUDY - Total Data Quality: Just What the Doctor Ordered
Commentary by Dr. Wayne Bruce, Associate Dean, Northern Ontario School of Medicine
The Northern Ontario School of Medicine (NOSM) is mandated to educate doctors and offers an undergraduate MD program, various residency programs, and the Continuing Health Professional Education (CHPE) program for practicing physicians. To manage the CHPE program, NOSM developed a business application to administer information and activities such as course descriptions, participant registrations, and transcript management. However, the system was manually intensive and time-consuming, consisting of a combination of disparate Microsoft SQL Server applications, Microsoft Access data sources, and paper processes. We wanted to improve our program by creating a single application and completely automating the process of scheduling and managing our CHPE events.
Critical Need for Data Quality Processes
To accomplish this application development project, NOSM issued requests for proposals (RFPs) to various systems integrators to obtain bids for the project. A critical capability was expertise in the area of data quality, as the initial data to be consolidated was not standardized or well monitored.
Ultimately, two systems integrators who work with Melissa Data were awarded the project, in large part because of their expertise in data quality and the strength of the Melissa Data Total Data Quality Integration Toolkit (TDQ-IT) for Microsoft SQL Server Integration Services (SSIS). We found that the TDQ-IT components provide the full spectrum of data quality capabilities within the SSIS data flow and can simply be added to SSIS as drag-and-drop icons within Visual Studio.
The two integrators, Fox River Software and Actuality Business Intelligence, won the bid based on a proposal to build a custom application using Ironspeed software as the Web site application development framework, Microsoft SQL Server as the centralized database, SSIS as the data integration and workflow tool, and Melissa Data TDQ-IT as the data quality solution.
Specific Data Quality Processes in the NOSM Application
For our application we are using Melissa Data’s TDQ-IT to profile (identify data quality issues), cleanse (test to meet business rules), parse and standardize (restructure data into a common format), match (find unique identifiers and perform deduplication), enrich (phone and e-mail validation), and monitor (check conformance to data quality requirements).
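To give a rough sense of what the match step does, fuzzy matching can be sketched with Python's standard difflib. This is only an illustration of the idea, not how TDQ-IT's matching components actually work; the names and the 0.85 threshold here are hypothetical.

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    """Return True when two strings are close enough to treat as duplicates.

    SequenceMatcher.ratio() returns a similarity score in [0, 1];
    comparing lowercased strings makes the check case-insensitive.
    """
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Hypothetical participant names drawn from two unconsolidated sources
print(similar("Dr. Wayne Bruce", "Dr Wayne Bruce"))  # near-duplicate -> True
print(similar("Wayne Bruce", "Anne Brock"))          # unrelated -> False
```

A real matching component would also standardize the inputs first (strip titles, expand abbreviations) and use a tuned survivorship rule to decide which of two matched records to keep.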
As one example, we utilized the Melissa Data e-mail and fuzzy matching components to take data inputs from six completely different sources (including Microsoft Access, Excel, flat files, and Novell GroupWise), merge them together, validate the e-mail addresses, and create a resulting list of deduplicated e-mail addresses and names. Ultimately, we were able to cleanse and integrate data from approximately 30 distinct source systems, which are now centralized in a new, automated CHPE system, saving large amounts of time and money.
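The merge-validate-deduplicate workflow described above can be sketched in plain Python. Note that the actual work was done with TDQ-IT's drag-and-drop SSIS components; this sketch only illustrates the underlying logic, and the field names, sample records, and simple regex check are all hypothetical stand-ins.

```python
import re

# Hypothetical contact records exported from different source systems
# (an Access table, an Excel sheet, a flat file, a GroupWise address book, ...)
sources = [
    [{"name": "J. Smith", "email": "j.smith@example.org"}],
    [{"name": "John Smith", "email": "J.Smith@Example.org "}],   # duplicate, different case
    [{"name": "Ann Lee", "email": "ann.lee@@bad"},               # malformed address
     {"name": "Ann Lee", "email": "ann.lee@example.org"}],
]

# Simple syntactic check; a stand-in for a full e-mail validation service
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def merge_and_dedupe(source_lists):
    """Merge lists, drop invalid e-mails, dedupe on the normalized address."""
    seen = {}
    for records in source_lists:
        for rec in records:
            email = rec["email"].strip().lower()
            if not EMAIL_RE.match(email):
                continue  # fails validation; a real tool would log or repair it
            # First occurrence wins; later duplicates are discarded
            seen.setdefault(email, {"name": rec["name"], "email": email})
    return list(seen.values())

clean = merge_and_dedupe(sources)
# The two "Smith" variants collapse to one record and the malformed
# address is dropped, leaving two clean, unique contacts.
```

Normalizing before matching (here, trimming whitespace and lowercasing) is the key step: without it, the same address entered two ways in two systems would survive as two records.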
A free white paper on this topic, “Six Steps to Managing Data Quality with SQL Server Integration Services,” is available from Melissa Data.