TDWI Checklist Reports
TDWI Checklist Reports provide an overview of success factors for a specific project in business intelligence, data warehousing, or a related data management discipline. Companies can use this overview to get organized before beginning a project or to identify goals and areas for improvement in current projects.
May 31, 2011
This TDWI Checklist Report helps you determine if your data integration environment has performance issues. It describes alternative solutions and includes the pros and cons for each.
May 10, 2011
This TDWI Checklist Report makes a case for private database clouds, especially as a platform for consolidated BI and DW applications.
October 29, 2010
If your organization, like many others, is looking to implement real-time data warehousing or data integration functions, download this report for a practical look at some critical questions.
August 16, 2010
This TDWI Checklist Report explains how to justify the time and money invested in metadata management solutions.
August 10, 2010
This TDWI Checklist Report is designed to provide a basic set of guidelines for implementing big data analytics.
June 4, 2010
This TDWI Checklist Report explains the challenges and benefits of improving product data quality.
May 28, 2010
The guidelines listed in this TDWI Checklist Report can help you achieve more modern, high-value, diverse, independent, well-designed, far-reaching, green, collaborative, and well-governed uses of data integration tools and techniques.
May 14, 2010
This TDWI Checklist Report provides recommendations for improving the quality of operational data, which in turn contributes to an organization’s drive toward operational excellence.
November 1, 2009
Data federation is an important tool in today’s data integration portfolio. Data and application architects use this middleware to query and join data from multiple sources on the fly and deliver the results to data-hungry decision makers. Data federation tools make the most sense when it takes too long or costs too much to create a persistent store of consolidated data, such as a data warehouse or data mart.
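To make the on-the-fly pattern concrete, the sketch below shows the basic idea in Python: query two independent sources at request time and join the results in memory, rather than first consolidating them into a warehouse or mart. The source names, schemas, and connection details are hypothetical and are not drawn from the report; a real federation product would handle query pushdown, optimization, and security on top of this.

import sqlite3

def fetch_open_orders(conn: sqlite3.Connection) -> list[tuple]:
    """Pull open orders from a hypothetical operational order database."""
    return conn.execute(
        "SELECT customer_id, order_id, amount FROM orders WHERE status = 'OPEN'"
    ).fetchall()

def fetch_customers(conn: sqlite3.Connection) -> dict[int, str]:
    """Pull customer names from a separate, hypothetical CRM database."""
    return dict(conn.execute("SELECT customer_id, name FROM customers"))

def federated_open_orders(orders_db: str, crm_db: str) -> list[dict]:
    """Join the two sources on the fly; nothing is persisted to a warehouse."""
    with sqlite3.connect(orders_db) as orders_conn, sqlite3.connect(crm_db) as crm_conn:
        customers = fetch_customers(crm_conn)
        return [
            {"order_id": oid, "customer": customers.get(cid, "unknown"), "amount": amt}
            for cid, oid, amt in fetch_open_orders(orders_conn)
        ]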