TDWI FlashPoint: Exclusive Excerpt for Datasource Consulting Subscribers
We'd like to extend a special welcome to Datasource Consulting's newsletter subscribers! Below, you'll find an excerpt from "2012 Industry Review from the Trenches" by Steve Dine and David Crolene. This article was published in the December 2012 issue of TDWI FlashPoint.
Distributed monthly via e-mail to thousands of BI/DW professionals, TDWI FlashPoint features unique how-to articles, key findings from TDWI Research, excerpts from the latest Premium Member publications, and tips on building and managing BI/DW teams. Written by TDWI Premium Members, fellows, and instructors, FlashPoint focuses on timely BI and DW issues.
If you are interested in reading the full article, we invite you to become a TDWI Premium Member. TDWI Premium Membership comes with a wide range of benefits, including a comprehensive selection of industry research, news, and information; access to all of TDWI's current and archived research and publications in password-protected areas of the TDWI website; and discounts on TDWI World Conferences, TDWI Seminars, and Certified Business Intelligence Professional (CBIP) exams.
Thank you for considering Premium Membership with TDWI! Please send us your questions and feedback.
2012 Industry Review from the Trenches
By Steve Dine and David Crolene, Datasource Consulting
Each year, we reflect upon the business intelligence (BI) and data integration (DI) industry and provide a review of the noteworthy trends that we encounter in the trenches. Our review emanates from five sources: our customers, industry conferences, articles, social media, and BI software vendors. This year has proved to be an interesting one on many fronts. Here are our observations for 2012 and our expectations about 2013.
1. BI programs matured
As we reported in 2011, a growing number of existing BI programs continue to mature beyond managed reports, ad hoc queries, dashboards, and OLAP. Companies are increasingly looking to derive more value from their data via technologies and capabilities such as:
- Text and social analytics
- Advanced data visualization
- Predictive and descriptive analytics
- Geospatial analysis
These capabilities all require significant computing power, and consequently we have seen a corresponding rise in technologies such as analytic databases and analytic applications. We see this trend continuing into 2013 and beyond.
2. Greater focus was placed on operational BI
Over the past few years, we have observed an increased focus on operational BI. As corporate BI programs mature, this is a natural evolution. There is considerable value in using BI to support and enhance operations within a company, but companies that succeed at this must realize that operational BI is a different class of workload. Operational BI generally requires lower data latency, higher data selectivity, and higher query concurrency than traditional analytic workloads. These factors often require a different architecture than what was designed for the data warehouse.
Furthermore, support of the system may need to be executed differently. If a load on a traditional data warehouse fails, it is often acceptable to address it within hours, not minutes. For operational BI, a 24/7 support model is more often required because load failures may immediately impact the bottom line.
We see the trend toward operational BI and lower latency analytics continuing as organizations broaden their focus from enterprise data warehousing to enterprise data management.
3. BI wanted to be agile
We've always recognized the high cost of, and long lead times for, implementing BI, but customers have finally said “enough,” and BI teams have had to listen. New software-as-a-service (SaaS) BI offerings and departmental solutions enable businesses to move forward without IT, putting even more pressure on BI programs to deliver results faster. Businesses are looking for new ways to implement BI and are finding that many agile practices (smaller, focused iterations; daily scrum meetings; embedded business representatives; prototyping; and integrated testing) help accelerate BI projects and bolster communication between business users and IT. Certain technologies are also helping drive this shift. Data virtualization, for example, enables a “prototype, then build” approach and doesn’t require physicalizing all the data required for analysis. However, agile was created for software development, not BI, and early adopters are learning that there are many differences. For example, the tools to automate software code testing are far more numerous and mature than those for ETL mappings and data warehouses.
We expect to see BI practitioners continue to refine which agile principles are effective with BI and which ones don't translate as well. We also expect to see a rise in enabling technologies, such as desktop analytic software, data virtualization, and automated testing/data validation.
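To make the automated testing/data validation idea concrete, here is a minimal sketch of the kind of post-load check an ETL team might automate. The table and column names (`stg_orders`, `dw_orders`, `order_id`) are assumptions for illustration only, not a reference to any particular tool or warehouse schema; a real validation suite would cover many more rules.

```python
import sqlite3

# Stand-in tables for a source staging area and a loaded warehouse target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dw_orders (order_id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 25.5), (3, 7.25)]
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)
cur.executemany("INSERT INTO dw_orders VALUES (?, ?)", rows)

def validate_load(cur):
    """Return a list of failed checks; an empty list means the load passed."""
    failures = []
    # Check 1: source-to-target row-count reconciliation.
    src = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
    tgt = cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
    if src != tgt:
        failures.append(f"row count mismatch: {src} source vs {tgt} target")
    # Check 2: no NULL business keys in the target.
    nulls = cur.execute(
        "SELECT COUNT(*) FROM dw_orders WHERE order_id IS NULL"
    ).fetchone()[0]
    if nulls:
        failures.append(f"{nulls} rows with NULL order_id")
    return failures

print(validate_load(cur))  # an empty list means all checks passed
```

Checks like these can run automatically after each load and fail the job before bad data reaches users, which is the role automated tests play for software code in agile development.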
4. Momentum shifted to “in-memory”
With the rise of 64-bit architectures and the ever-decreasing cost of memory, we have seen momentum for in-memory processing shift from standalone applications to the database itself (notably in data warehouse appliances). The past several years have seen ...
Interested in reading the full article? Become a Premium Member today!
About the Authors
Steve Dine is the managing partner at Datasource Consulting, LLC. He has extensive hands-on experience delivering and managing successful, highly scalable, and maintainable data integration and business intelligence solutions. Steve is a faculty member at TDWI and a judge for the TDWI Best Practices Awards. Follow Steve on Twitter: @steve_dine.
David Crolene is a partner at Datasource Consulting, LLC. With 15+ years of experience in business intelligence and data integration, David has worked for two leading BI vendors, managed data warehouses for an electronics giant, and consulted across a range of industries and technologies.