Cohesive Information Integration – Blending ETL, Data Quality, MDM to Unify the Enterprise Information View
What is a “customer” or a “product”? How are these data concepts defined, and in how many places are they inadvertently replicated across the enterprise? Most organizations run many different applications supporting the functional requirements of specific operational processes. But while the data may support immediate system needs, it is rarely suitable for the needs of all downstream data consumers. And as the demand for data sharing grows, variant structures, data models, conflicting definitions, and misaligned semantics for commonly used data concepts will confound data consolidation attempts.

Inaccurate, incomplete, untimely, and inconsistent data contribute to many business impacts, ranging from an impaired sales process and customer retention issues to increased operational costs, non-compliance, and exposure to leakage and fraud.

In this webinar, we explore ways to characterize enterprise requirements for a “service level” for information, and how data integration techniques incorporate standard (industry-based) data models (for both operational and analytical computing), ETL, data standardization, enhancement, identity resolution, and master data management to provide a consistent, unified view of key data concepts. We then look at how those techniques are critical to real-time operational business intelligence, and how data quality metrics can be incorporated with data federation to provide continuous assurance that the information satisfies enterprise needs.
You will learn:
- The spectrum of techniques for data preparation
- Meeting the demand for high-quality data
- Unifying the view of core master data concepts
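
As a minimal illustration of two of the techniques named above, the sketch below standardizes customer records from two source systems and applies a simple deterministic identity-resolution rule to merge them into one master record. It is not the webinar's own implementation: the field names, normalization rules, matching key, and survivorship rule are all illustrative assumptions.

```python
import re

def standardize(record):
    """Normalize a record into comparable form: lowercase the name,
    strip punctuation, sort name tokens (so 'SMITH, JOHN' and
    'John Smith' compare equal), and keep only digits of the phone."""
    name = re.sub(r"[^a-z ]", "", record["name"].lower())
    name = " ".join(sorted(name.split()))
    phone = re.sub(r"\D", "", record.get("phone", ""))
    return {**record, "name": name, "phone": phone}

def consolidate(records):
    """Deterministic identity resolution: group standardized records
    by a (name, phone) matching key and merge each group into a
    single master record."""
    masters = {}
    for rec in map(standardize, records):
        key = (rec["name"], rec["phone"])
        master = masters.setdefault(key, {"source_systems": []})
        master["source_systems"].append(rec["system"])
        for field in ("name", "phone", "email"):
            # Survivorship rule (assumed): first non-empty value wins
            if not master.get(field):
                master[field] = rec.get(field, "")
    return list(masters.values())

# Two systems hold the same customer under variant representations
records = [
    {"system": "CRM", "name": "John  Smith",
     "phone": "(555) 123-4567", "email": ""},
    {"system": "Billing", "name": "SMITH, JOHN",
     "phone": "555.123.4567", "email": "jsmith@example.com"},
]
masters = consolidate(records)
```

In practice the deterministic key would be replaced by probabilistic or fuzzy matching, and the survivorship rules would be driven by per-attribute source trust, but the pipeline shape (standardize, resolve identity, merge) is the same.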