Speed, agility, and intelligence are competitive advantages that nearly all organizations seek. To seize these advantages, organizations require timely, diverse, complete, and accurate data. Unfortunately, traditional data warehouse extraction, transformation, and loading (ETL) processes are not fast enough. They put too much burden on ETL developers to understand every nuance of every data source, and it’s getting worse as Hadoop and other big data sources become part of the mix. How can organizations take advantage of new big data sources to deliver complete and diverse views of data—and get beyond the limits of traditional data warehouses?
Forward-thinking organizations are adopting a new architecture whose centerpiece is virtual data processing, also known as data virtualization or data federation. Leading analyst firms have coined their own terms for this emerging architecture, calling it an “information fabric” or a “logical data warehouse.” The goal of this approach is to employ new technologies and practices that can serve up more complete views of remote data sources quickly, without routing everything through slower ETL processes.
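To make the contrast with ETL concrete, here is a minimal sketch of the federation idea: rather than copying data into a warehouse ahead of time, a virtual layer queries each source live and combines the results at request time. The example below uses two in-memory SQLite databases to stand in for remote systems; all names (`crm_db`, `sales_db`, `federated_customer_revenue`) are illustrative assumptions, not the API of any particular product.

```python
# Sketch of data federation: query each source live and join on the fly,
# with no ETL copy step into a central warehouse.
import sqlite3

# Source 1: a CRM system holding customer records (stand-in for a remote source).
crm_db = sqlite3.connect(":memory:")
crm_db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm_db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Acme"), (2, "Globex")])

# Source 2: an order system holding transactions (a second independent source).
sales_db = sqlite3.connect(":memory:")
sales_db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
sales_db.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(1, 100.0), (1, 250.0), (2, 75.0)])

def federated_customer_revenue():
    """Combine data from both live sources at query time."""
    names = dict(crm_db.execute("SELECT id, name FROM customers"))
    totals = {}
    for cid, amount in sales_db.execute("SELECT customer_id, amount FROM orders"):
        totals[cid] = totals.get(cid, 0.0) + amount
    return {names[cid]: total for cid, total in totals.items()}

print(federated_customer_revenue())  # {'Acme': 350.0, 'Globex': 75.0}
```

A production data-virtualization layer adds query optimization, pushdown to the sources, and caching on top of this basic pattern, but the core idea is the same: the join happens at query time against the systems of record.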