Solving Modern Data Integration Challenges with an Enterprise Integration Fabric
TDWI Speaker: David Loshin, President of Knowledge Integrity
Date: Wednesday, January 22, 2020
Time: 9:00 a.m. PT, 12:00 p.m. ET
Throughout the history of data warehousing, data integration processes have been critical for two reasons: they shelter transaction processing systems from the additional processing loads that would jeopardize performance service-level agreements (SLAs), and they ensure that data originating from different sources can be combined to enable more robust and complete enterprise reporting and analytics. Many existing data warehouses have relied on a conventional data integration process: extract data from the sources, move the extracted data sets to a staging area, standardize and cleanse the extracted data, then schedule bulk loads into the target data warehouse.
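To make that conventional flow concrete, the sketch below walks through extract, stage, cleanse, and bulk load in miniature. All table and column names (orders, dw_orders, and so on) are hypothetical, and sqlite3 stands in for both the source system and the target warehouse so the example is self-contained; it illustrates the pattern, not any particular product's pipeline.

```python
# Minimal sketch of the conventional batch ETL flow:
# extract -> stage/cleanse -> scheduled bulk load.
import sqlite3

def extract(source: sqlite3.Connection) -> list[tuple]:
    # Pull the raw rows off the transaction system in a single pass,
    # sheltering ongoing OLTP work from repeated analytic queries.
    return source.execute("SELECT id, customer_name, amount FROM orders").fetchall()

def stage_and_cleanse(rows: list[tuple]) -> list[tuple]:
    # Staging step: standardize names and reject incomplete records.
    staged = []
    for row_id, name, amount in rows:
        if name is None or amount is None:
            continue  # drop rows that fail basic quality checks
        staged.append((row_id, name.strip().title(), round(float(amount), 2)))
    return staged

def bulk_load(warehouse: sqlite3.Connection, rows: list[tuple]) -> None:
    # Scheduled bulk load into the target warehouse table.
    warehouse.executemany("INSERT INTO dw_orders VALUES (?, ?, ?)", rows)
    warehouse.commit()

if __name__ == "__main__":
    source = sqlite3.connect(":memory:")
    source.execute("CREATE TABLE orders (id INTEGER, customer_name TEXT, amount REAL)")
    source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                       [(1, "  alice smith ", 19.991), (2, None, 5.0), (3, "BOB JONES", 42.5)])

    warehouse = sqlite3.connect(":memory:")
    warehouse.execute("CREATE TABLE dw_orders (id INTEGER, customer_name TEXT, amount REAL)")

    bulk_load(warehouse, stage_and_cleanse(extract(source)))
    print(warehouse.execute("SELECT * FROM dw_orders").fetchall())
```

Note that the entire flow runs as one scheduled job: nothing reaches the warehouse until the batch completes, which is precisely the latency problem the webinar addresses.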
However, these decades-old techniques for data integration are rapidly becoming obsolete as organizations adopt hybrid architectures to support modern real-time analytics and machine learning projects. Whether you need to synchronize data across on-premises/cloud boundaries, integrate data from SaaS/PaaS providers, or ingest and process streaming data in real time, the traditional extract, transform, and load (ETL) approach is insufficient to enable real-time predictive and prescriptive analytics. In this webinar, we discuss how traditional data integration may be impeding modern analytics and explore new ideas for modernizing it with an enterprise integration fabric.
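As a rough illustration of what ingesting and processing streaming data in real time means in contrast to scheduled bulk loads, here is a minimal, self-contained sketch. The in-process queue is an assumption standing in for a real event broker (such as Kafka), and the running-revenue aggregate is a hypothetical example metric.

```python
# Minimal sketch of streaming ingestion: events are processed as they
# arrive and an analytic aggregate stays continuously current, rather
# than waiting for a scheduled bulk load to finish.
import queue
import threading
import time

def produce(events: queue.Queue) -> None:
    # Stand-in for a transaction system emitting events in real time.
    for amount in (19.99, 24.99, 42.50):
        events.put({"amount": amount})
        time.sleep(0.1)
    events.put(None)  # sentinel: stream closed

def consume(events: queue.Queue) -> None:
    # Update a running total on every event instead of reloading in bulk.
    total = 0.0
    while (event := events.get()) is not None:
        total += event["amount"]
        print(f"running revenue: {total:.2f}")

if __name__ == "__main__":
    events: queue.Queue = queue.Queue()
    threading.Thread(target=produce, args=(events,), daemon=True).start()
    consume(events)
```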
Attendees will learn how to:
- Assess the scope of enterprise data integration
- Categorize data access use cases
- Establish data integration SLAs to support modern architectures
- Integrate new techniques for ETL and change data capture (CDC) (see the sketch after this list)
- Leverage visualization and low-code/no-code methods to simplify data integration development
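To illustrate the CDC technique named above, here is a minimal sketch of log-based change replay: rather than re-extracting whole tables, the pipeline reads an ordered stream of change events and applies each one to the target. The ChangeEvent shape and the apply_event logic are illustrative assumptions, not a specific tool's API.

```python
# Minimal sketch of change data capture (CDC): tail an ordered change
# log and replay each event against the target, instead of bulk reloads.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChangeEvent:
    seq: int            # position in the source's change log
    op: str             # "insert", "update", or "delete"
    key: int            # primary key of the affected row
    row: Optional[dict] # new column values (None for deletes)

def apply_event(target: dict, event: ChangeEvent) -> None:
    # Keep the target in sync by replaying one change at a time.
    if event.op == "delete":
        target.pop(event.key, None)
    else:
        # Inserts and updates both upsert the latest row image.
        target[event.key] = event.row

if __name__ == "__main__":
    target: dict = {}
    log = [
        ChangeEvent(1, "insert", 101, {"name": "alice", "amount": 19.99}),
        ChangeEvent(2, "update", 101, {"name": "alice", "amount": 24.99}),
        ChangeEvent(3, "delete", 101, None),
    ]
    for event in sorted(log, key=lambda e: e.seq):
        apply_event(target, event)
    print(target)  # {} after the insert, update, delete replay
```

Because each event carries only the delta, the target stays current without the load spikes of periodic full extracts, which is what makes CDC a fit for the real-time SLAs discussed above.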
Provided by Equalum