On Demand
A logical data warehouse is an architectural layer that sits atop the usual data warehouse (DW) store of persisted data. The logical layer provides (among other things) several mechanisms for viewing data in the warehouse store and elsewhere across an enterprise without relocating and transforming data ahead of view time. These views also serve as interfaces into disparate data and its sources. In other words, the logical data warehouse complements the traditional core warehouse (and its primary function of a priori data aggregation, transformation, and persistence) with functions that fetch and transform data in real time (or near real time), instantiating non-persisted data structures as needed (a minimal sketch of this query-time federation follows this listing).
Philip Russom, Ph.D.
Sponsored by
SAP, Co-Sponsored by Intel
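To make the query-time federation idea above concrete, here is a minimal sketch in Python. It is purely illustrative and not part of Russom's material: the two sources, all table and column names, and the sqlite3/pandas simulation are assumptions standing in for a real warehouse and a real external system.

# A minimal sketch of query-time federation, the core idea behind a logical
# data warehouse: data stays in its source systems and is fetched, joined,
# and transformed only when a view is requested. All sources, tables, and
# columns here are hypothetical.

import sqlite3
import pandas as pd

# Source 1: the persisted core warehouse (simulated with an in-memory database).
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
warehouse.executemany("INSERT INTO sales VALUES (?, ?)",
                      [(1, 120.0), (2, 75.5), (1, 30.0)])

# Source 2: an external system the warehouse never ingested (simulated here as a
# DataFrame; in practice it might be a SaaS application, a REST API, or Hadoop).
crm = pd.DataFrame({"customer_id": [1, 2], "segment": ["enterprise", "smb"]})

def customer_spend_view():
    """Logical view: assembled on demand, never persisted in the warehouse."""
    sales = pd.read_sql_query(
        "SELECT customer_id, SUM(amount) AS total FROM sales GROUP BY customer_id",
        warehouse)
    # Join across systems at view time instead of relocating data ahead of time.
    return sales.merge(crm, on="customer_id")

print(customer_spend_view())

The point of the sketch is that the join and aggregation happen only when the view is requested, so nothing new is persisted in the warehouse store.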
Self-service business intelligence (BI) software is bringing analysts and business users together and driving a fundamental cultural shift toward truly data-driven organizations. Broader access to reliable, curated data can improve business performance with top- and bottom-line impact. More businesses are seeing this benefit as interest in self-service BI tools grows, according to TDWI research.
Fern Halper, Ph.D.
Sponsored by
Looker
The three-decade-old enterprise data warehouse is evolving into an enhanced data warehouse architecture in which Hadoop acts as a supporting platform for traditional data warehouse activities. The challenge with this enhanced approach is how to store and access data transparently, regardless of where it resides and how it is managed. This presentation explores why organizations are adding Hadoop to the traditional data warehouse, presents use cases for such an environment, and takes a detailed look at why organizations need a common and transparent interface to both traditional relational and Hadoop data management systems (see the sketch after this listing).
Colin White
Sponsored by
TDWI and IBM Content
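The "common and transparent interface" described above can be sketched as a thin access layer that hides where each table lives. The sketch below is a hypothetical illustration, not any vendor's actual product: the catalog, table names, and the simulated Hadoop backend (a plain DataFrame standing in for data on HDFS or in Hive) are all assumptions.

# A minimal sketch of a common, transparent data access layer: callers ask for
# a table by name, and the layer decides whether it lives in the relational
# warehouse or in Hadoop. Backends are simulated; in a real environment the
# Hadoop branch would go through Hive, Impala, or Spark SQL. All names below
# are hypothetical.

import sqlite3
import pandas as pd

relational = sqlite3.connect(":memory:")
relational.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
relational.executemany("INSERT INTO customers VALUES (?, ?)",
                       [(1, "Acme"), (2, "Globex")])

# Stand-in for data kept on Hadoop (e.g., raw clickstream files on HDFS).
hadoop_files = {"clicks": pd.DataFrame({"id": [1, 1, 2],
                                        "page": ["/", "/pricing", "/"]})}

CATALOG = {"customers": "relational", "clicks": "hadoop"}  # where each table lives

def read_table(name: str) -> pd.DataFrame:
    """Single entry point: storage engine and location are hidden from the caller."""
    if CATALOG[name] == "relational":
        return pd.read_sql_query(f"SELECT * FROM {name}", relational)
    return hadoop_files[name]  # simulated HDFS/Hive read

# The caller joins warehouse and Hadoop data without knowing where either lives.
print(read_table("customers").merge(read_table("clicks"), on="id"))

The design point is that location is resolved by the access layer's catalog, so queries and reports do not change when a table moves between the warehouse and Hadoop.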
Many organizations are launching their first predictive analytics project and are not sure how or where to begin. With a great deal of hype around big data and data science, most companies understandably make hiring a data scientist with an extensive resume their first step.
Keith McCormick
Sponsored by
SAP
The field of business analytics is undergoing massive change as vendors introduce disruptive technologies such as analytic appliances, non-relational systems, cloud computing, and big data analytics. What vendors often forget in the rush to market these new technologies is that many organizations are still struggling to meet performance demands, governance requirements, and user expectations with their existing data warehouse and analytics environment, and may not have the resources to take advantage of new industry developments.
Colin White
Sponsored by
Teradata
TDWI Research indicates that more companies are considering moving to public or hybrid cloud offerings for some or all of their analytics. Whether for customer, supply chain, or financial metrics, such organizations often collect large amounts of data—especially public cloud-generated data—and are interested in analyzing that information in the cloud.
Fern Halper, Ph.D.
Sponsored by
Teradata
No matter the vintage or sophistication of an organization’s data warehouse (DW) and the environment around it, the DW probably needs one or more upgrades and enhancements to address new requirements for advanced analytics, real-time processing, streaming data, machine data, big data, and unstructured data. These and related issues are addressed in a new Checklist Report by TDWI’s Philip Russom called Tips for Modernizing a Data Warehouse. That report was sponsored by vendor firms Cloudera, Impetus, MapR Technologies, and Teradata Corporation.
Philip Russom, Ph.D.
Sponsored by
Cloudera, Impetus Technologies, MapR, Teradata