
TDWI Blog

Wayne Eckerson

Reflections on the practice of business intelligence.

By Wayne Eckerson. Eckerson is the author of many in-depth reports, a columnist for several business and technology magazines, and a noted speaker and blogger. He is also the author of the best-selling book Performance Dashboards: Measuring, Monitoring, and Managing Your Business (John Wiley & Sons, 2005) and TDWI’s BI Maturity Model.


Crossing the Chasm II - Departmental to Enterprise BI

In a prior blog, I discussed strategies for crossing the “Chasm” in TDWI’s five-stage BI Maturity Model. The Chasm represents challenges that afflict later-stage BI programs. In that post, I showed how later-stage BI programs must justify their existence based on strategic value rather than the cost savings and efficiency gains that are the hallmark of early-stage programs.

But perhaps a more serious challenge facing BI programs that want to cross the Chasm is migrating from a departmental BI solution to one with an enterprise scope.

Departmental Solutions. In my experience, some of the most successful BI solutions in terms of user satisfaction are developed at the department level. One reason is that the scope of the project is manageable. A department is small enough that the technical developers might sit side by side with the business analysts and subject matter experts. They may see themselves as belonging to a single team and may even report to the same boss. The BI project is smaller and probably draws from just one or two sources that are used regularly and well known by people in the department. Moreover, hammering out semantics is straightforward: everyone uses the same language and rules to define things because they all live and work in the same “departmental bubble.”

Chasm Challenges

Politics. Conversely, developing an enterprise BI solution or migrating a successful departmental one to the enterprise level is fraught with peril. The politics alone can torpedo even the most promising applications. The departmental “owners” are afraid corporate IT will suck the lifeblood out of their project by over-architecting the solution. Corporate sponsors jockey for position to benefit from the new application but are reluctant to pony up hard dollars to lay the infrastructure for an enterprise application that benefits the entire company, not just them.

Semantics. In addition, semantics are a nightmare to resolve in an enterprise environment. What sales calls a customer is often different than how finance or marketing defines a customer. Ditto for sale, margin, and any other common term used across the enterprise. In fact, it’s always the most common terms that are the hardest to pin down when crossing departmental boundaries.

Scope. Finally, scope is an issue. Enterprise applications often try to tackle too much and get stalled in the process. The number of data sources rises, the size of the team increases, and the number of meetings, architectural reviews, and sign-offs mounts inexorably. The team and its processes eventually get in the way of progress. Put simply, the BI program becomes a bottleneck.

Hybrid Organizational Model

To cross the chasm, BI programs need to combine the agility of departmental-scale efforts with the economies of scale and integration delivered by enterprise solutions. To do this, teams must adopt a hybrid organizational model, in which the central or corporate BI team manages the repository of shared data, a common semantic model, and a set of documented standards and best practices for delivering BI solutions.

Departmental IT. The corporate team then works closely with departmental IT staff responsible for developing BI solutions. The departmental developers leverage the enterprise data warehouse, common semantic model, and ETL and BI tools to create subject-specific data marts and standard reports tailored to departmental users. The corporate team oversees their development, ensuring adherence to standards where possible.

Super Users. If no IT staff exists in the department, the BI team cultivates “super users”—technically savvy business users in each department—to be its liaison or “feet on the ground” there. The corporate BI team creates subject-specific data marts and standard reports for each department using input from the super users. It then empowers the super users to create ad hoc reports around the boundaries of the standard reports, but not overlapping them.

BICC. A fully functioning BI Competency Center adopts this hybrid model, in which corporate BI staff work cooperatively with departmental IT professionals and super users. The glue that holds this hybrid model together is the set of standards and best practices that empower departmental experts to build solutions quickly while leveraging enterprise resources.

This approach ensures both agility and alignment, a combination rarely seen in BI programs. The fear of giving control back to departments and proliferating spreadmarts is why many companies are stuck in the chasm. But unless the corporate BI team cedes development work via well-documented standards and practices, it becomes a bottleneck to development and causes spreadmarts to grow anyway.

Posted by Wayne Eckerson on November 16, 2009


Operational BI: To DW or Not?

Operational business intelligence (BI) means many things to many people. But the nub is that it delivers information to decision makers in near real time, usually within seconds, minutes, or hours. The purpose is to empower front-line workers and managers with timely information so they can work proactively to improve performance.

Key Architectural Decision. This sounds easy, but it’s hard to do. Low-latency, or operational, BI systems have a lot of moving parts, and there is not much time to recover from errors, especially in high-volume environments. The key decision you need to make when architecting a low-latency system is whether to use the data warehouse (DW) or not. The ramifications of this decision are significant.

On one hand, the DW will ensure the quality of low-latency data; but doing so may disrupt existing processes, add undue complexity, and adversely impact performance. On the other hand, creating a stand-alone operational BI system may be simpler and provide tailored functionality and higher performance, but potentially creates redundant copies of data that compromise data consistency and quality.

So take your pick: either add complexity by rearchitecting the DW or undermine data consistency by deploying a separate operational BI system. It’s kind of a Faustian bargain, and neither option is quick or cheap.

Within the DW

If you choose to deliver low-latency data within your existing DW, you have three options:

1. Mini Batch. One option is simply to accelerate ETL jobs by running them more frequently. If your DW supports mixed workloads (e.g., simultaneous queries and updates), this approach allows you to run the DW 24x7. Many start by loading the DW hourly and then move to 15-minute loads, if needed. Of course, your operational systems may not be designed to support continuous data extracts, so this is a consideration. Many companies start with this option since it uses existing processes and tools, just run more often (a simplified sketch of an incremental mini-batch load appears after this list).

2. Change Data Capture. Another option is to apply change data capture (CDC) and replication tools that extract new records from system logs and move them in real time to an ETL tool, staging table, flat file, or message queue so they can be loaded into the DW. This approach minimizes the impact on both your operational systems and the DW since you are only updating records that have changed instead of wiping the slate clean with each load. Some companies combine mini-batch and CDC to streamline processes even further.

3. Trickle Feed. A final option is to trickle feed records into the DW directly from an enterprise service bus (ESB), if your company has one. Here, the DW subscribes to selected events which flow through a staging area and into the DW. Most ETL vendors sell specialized ESB connectors to trickle feed data or you can program a custom interface. This is the most complex of the three approaches since there is no time to recover from a failure, but it provides the most up-to-date data possible.
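To make the mini-batch option concrete, here is a minimal sketch of a timestamp-driven incremental load. The table and column names (orders, stg_orders, last_modified) are hypothetical, and sqlite3 merely stands in for the real source and warehouse connections; a production job would add scheduling (say, every 15 minutes), error handling, and the merge from staging into the warehouse tables.

```python
import sqlite3

def incremental_load(source: sqlite3.Connection,
                     warehouse: sqlite3.Connection,
                     watermark: str) -> str:
    """Copy rows changed since the last watermark into a staging table."""
    # Extract only the rows modified since the previous run.
    rows = source.execute(
        "SELECT order_id, amount, last_modified "
        "FROM orders WHERE last_modified > ?",
        (watermark,),
    ).fetchall()

    # Land the changed rows in the warehouse staging table.
    warehouse.executemany(
        "INSERT OR REPLACE INTO stg_orders (order_id, amount, last_modified) "
        "VALUES (?, ?, ?)",
        rows,
    )
    warehouse.commit()

    # Advance the watermark so the next run picks up only newer changes.
    return max((r[2] for r in rows), default=watermark)
```

The same watermark pattern also underlies many CDC configurations; the difference is that CDC tools read changes from database logs rather than querying the source tables directly.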

Working Outside the DW

If your existing DW doesn’t lend itself to operational BI for architectural, political, or philosophical reasons, then you need to consider building or buying a complementary low-latency decision engine. There are three options here: 1) data federation, 2) operational data stores, and 3) event-driven analytic engines.

1. Data federation. Data federation tools query and join data from multiple source systems on the fly. These tools create a virtual data mart that can combine historical and real-time data without the expense of creating a real-time DW infrastructure. Data federation tools are ideal when the number of data sources, volume of data, and complexity of queries are low.

2. ODS. Companies often use operational data stores (ODS) when they want to create operational reports that combine data from multiple systems and don’t want users to query source systems directly. An ODS extracts data from each source system in a timely fashion to create a repository of lightly integrated, current transaction data. To avoid creating redundant ETL routines and duplicate copies of data, many organizations load their DW from the ODS.

3. Event-driven engines. Event-driven analytic engines apply analytics to event data from an ESB as well as static data in a DW and other applications. The engine filters events, applies calculations and rules in memory, and triggers alerts when thresholds have been exceeded (a simplified sketch of this filter-and-alert loop follows this list). Although tailored to meet high-volume, real-time requirements, these systems can also support general-purpose BI applications.
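To illustrate the event-driven approach, here is a minimal sketch of the in-memory filter, rule, and alert loop such an engine performs. The event fields, the running-total rule, and the 10,000 threshold are all hypothetical; a real engine would consume events from an ESB or message queue and typically enrich them with static reference data from the DW.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    customer_id: str
    amount: float

def detect_breaches(events, threshold: float = 10_000.0):
    """Yield an alert whenever a customer's running total exceeds the threshold."""
    running_total = defaultdict(float)   # in-memory state, keyed by customer
    for event in events:
        if event.amount <= 0:            # filter step: ignore non-positive amounts
            continue
        running_total[event.customer_id] += event.amount
        if running_total[event.customer_id] > threshold:
            yield f"ALERT: {event.customer_id} exceeded {threshold:,.0f}"

# Example usage with a hypothetical stream of three events.
stream = [Event("C1", 6_000), Event("C2", 2_500), Event("C1", 5_000)]
for alert in detect_breaches(stream):
    print(alert)   # fires once C1's running total passes the threshold
```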

In summary, you can architect operational BI systems in multiple ways. The key decision is whether to support operational BI inside or outside the DW. Operational BI within a DW maintains a single version of truth and ensures high quality data. But not all organizations can afford to rearchitect a DW for low latency data and must look to alternatives.

Posted by Wayne Eckerson on November 6, 2009


MDM Lessons Learned

Master data management (MDM) enables organizations to maintain a single, clean, consistent set of reference data about common business entities (e.g., customers, products, accounts, employees, and partners) that can be used by any individual or application that requires it. In many respects, MDM applies the same principles and techniques that apply to data warehousing—clean, accurate, authoritative data.

Not surprisingly, many data warehousing (DW) professionals have taken the lead in helping their organizations implement MDM solutions. Yet, even grizzled DW veterans pose fundamental questions about how to get started and succeed in this new arena. Here are answers to the seven most common questions:

1. What’s the best place to start with MDM? People want to know whether it’s best to start with customer, product, or account data, or whether the finance, service, or marketing department is most receptive to MDM. The actual starting place is determined by your organization and the amount of pain that different groups or departments feel due to lack of conformed master data. The only surefire advice is to start small and work incrementally to deliver an enterprise solution.

2. How do you fund MDM? Few people have succeeded in funding stand-alone MDM projects, especially if their company has recently funded data warehousing, data quality, and CRM initiatives. Executives invariably ask, “Weren’t those initiatives supposed to address this?” Replying that MDM makes those initiatives more efficient and effective just doesn’t cut it. The best strategy is to bake MDM projects into the infrastructure requirements for new strategic initiatives.

3. How do you architect an MDM solution? The right architecture depends on your existing infrastructure, what you’re trying to accomplish, and the scope and type of reference data you need to manage. A classic MDM hub is essentially a data reconciliation engine that can feed harmonized master data to a range of systems—including the data warehouse. MDM hubs come in all shapes and sizes: on one extreme, a hub serves as the only source of master data for all applications; on the other, it simply maintains keys to equivalent records in every application (a simplified registry-style sketch appears after question 7 below). Most MDM solutions fall somewhere in the middle.

4. What’s the role of the data warehouse in MDM? There is no reason you can’t designate a single application to serve as the master copy. For example, you could designate the data warehouse as the master for customer data or an Oracle Financials application as the master for the chart of accounts. These approaches are attractive because they reuse existing models, data, and infrastructure, but may not be suitable in all situations. For instance, you may want an MDM solution that supports dynamic bidirectional updates of master data in both the hub and operational applications. This requires a dynamic matching engine, a real-time data warehouse, and Web services interfaces to integrate both ends of the transaction.

5. What organizational pitfalls will I encounter? Managing the expectations of business and IT stakeholders is nothing less than a make-or-break proposition. “Change management can derail an MDM project,” says one chief technology officer at a major software manufacturer that implemented a global MDM project. “When you change the data that end users have become accustomed to receiving, it can cause significant angst. You have to anticipate this, implement a transition plan, and prepare the users.” In addition, don’t underestimate the need to educate IT professionals about the need for MDM and the new tools and techniques required to implement it.

6. What technical pitfalls will I encounter? First of all, MDM requires a panoply of tools and technologies, some of which may already exist in your organization. These include database management systems, data integration tools, data matching and quality tools, rules-based systems, reporting tools, scheduling, and workflow management. Buying a packaged solution alleviates the need to integrate these tools. But if you already have the tools that exist in a package, negotiate a steep discount. Early MDM adopters say the biggest challenges are underestimating the time and talent required to define and document MDM requirements, analyze source data, maintain high-performance Web services interfaces, and fine-tune matching algorithms to avoid under- or over-matching (a simple illustration of that trade-off also follows question 7).

7. How do I manage a successful MDM implementation? To succeed, MDM requires business managers to take responsibility for defining master data and maintaining its integrity. This involves assigning business executives to stewardship roles in which they drive consensus about data definitions and rules and oversee processes for changing, managing, auditing, and certifying master data. Good data governance may or may not involve steering committees and meetings, but it always involves establishing clear policies and processes and holding business people accountable for the results.
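As a small illustration of the lightweight end of the hub spectrum described in question 3, here is a sketch of a registry-style hub that only maintains keys to equivalent records in each application. The system names and keys are hypothetical; a real hub would add matching rules, survivorship logic, and persistence.

```python
class CustomerRegistry:
    """Registry-style MDM hub: maps a master key to each system's local key."""

    def __init__(self):
        self._xref = {}   # master_id -> {system_name: local_key}

    def register(self, master_id: str, system: str, local_key: str) -> None:
        """Record that a local record in `system` represents this master entity."""
        self._xref.setdefault(master_id, {})[system] = local_key

    def lookup(self, master_id: str) -> dict:
        """Return every system's key for the given master entity."""
        return dict(self._xref.get(master_id, {}))

# Example: the same customer known by different keys in CRM and billing.
registry = CustomerRegistry()
registry.register("CUST-001", "crm", "A-98231")
registry.register("CUST-001", "billing", "7744210")
print(registry.lookup("CUST-001"))   # {'crm': 'A-98231', 'billing': '7744210'}
```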
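And to illustrate the matching trade-off noted in question 6, here is a toy similarity threshold; difflib is only a stand-in for the specialized matching engines MDM products actually use. Set the threshold too low and distinct customers get merged (over-matching); set it too high and true duplicates are missed (under-matching).

```python
from difflib import SequenceMatcher

def is_same_customer(name_a: str, name_b: str, threshold: float = 0.85) -> bool:
    """Treat two names as the same entity when their similarity exceeds the threshold."""
    score = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return score >= threshold

print(is_same_customer("Acme Corp.", "ACME Corporation"))   # False at 0.85: a likely missed match
print(is_same_customer("Acme Corp", "Acme Corp."))          # True
```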

MDM is a major undertaking and there is much to learn to be successful. But hopefully the answers to these seven questions will get you moving in the right direction.

Posted by Wayne Eckerson on November 6, 2009


Teradata the Wise

One sign of wisdom is that you become less dogmatic about things. Years of experience show you (often the hard way) that there is no one right way to think or act or be. People, cultures, tastes, and beliefs come in all shapes and sizes. And so do data warehouses.

It’s good to see Teradata acknowledge this after years of preaching the enterprise data warehouse uber alles. President and CEO Mike Koehler said in his keynote this week at the Teradata Partners conference, “We like data marts.” And he could have added “operational data stores,” “appliances,” “the cloud,” and so on. Rather than pitching a single platform and architectural approach, Teradata now offers an “ecosystem” of products (i.e., its appliances), enabling customers to pick and choose the offerings that best meet their needs and budget. Kudos!

Empowering Analysts

Teradata Agile Analytics for the Cloud. I was particularly captivated by the newly announced Teradata Agile Analytics for the Cloud, which enables individual business analysts to automatically carve out temporary sandboxes within a central data warehouse. Business analysts can upload data to the sandbox and run queries against their data and data in the warehouse they have permission to access. This empowers business analysts to explore data without tempting them to create renegade data marts under their desks.

Like any good software vendor, Teradata is following the lead of some of its more advanced customers, such as eBay, which have been doing this for years. Oliver Ratzeberger, eBay’s senior director of architecture and operations, discussed his use of these types of sandboxes at last year’s TDWI Executive Summit and again this week at Teradata Partners. eBay has now moved beyond what Teradata announced (which is not yet officially available) and can now dynamically reallocate resources to departmental data marts (not just personal data marts) based on usage. eBay calls these virtual data marts. Hopefully, Teradata will add this functionality in the near future, and then we’ll truly know it likes data marts!

I think Teradata’s marketing team took some liberties with the product name by adding the word “cloud” to it. Although perhaps technically correct, the name is confusing to the general data warehousing community. I’d rather see them call it something more straightforward, like Teradata’s Personal Sandbox Service.

Posted by Wayne Eckerson on October 22, 2009


Crossing The Chasm Part 1: Delivering Strategic Value

TDWI’s Maturity Model is represented by a bell curve spanning five stages of maturity. The curve, which represents the percentage of companies at each stage, is broken in two spots. The first is the Gulf and the second is the Chasm. (See Figure 1.)

The Gulf and the Chasm represent the obstacles and challenges that early- and later-stage programs, respectively, encounter during their BI journey. Companies in the Gulf struggle with sponsorship, funding, data quality, project scope, and spreadmarts. Companies in the Chasm struggle with the politics, logistics, and dynamics of delivering an enterprise BI environment. (An enterprise environment may span the entire organization, a business unit, or a profit-and-loss center.)

While the Gulf is difficult, the Chasm is downright perilous. Many BI programs fall into the Chasm and are never heard from again. During the next several weeks, I’ll write a series of blog posts designed to help BI professionals better understand the hazards of the Chasm and how to overcome them.

    

Figure 1.

From Cost-Savings to Strategic Value

The first major challenge that companies face in the Chasm is re-justifying their existence. By now, the BI team has squeezed all the cost-efficiencies out of the reporting process and must justify itself by the strategic value it offers the organization. Executives funded early data warehousing projects in part because of the projected ROI from consolidating legacy reporting systems and freeing high-priced analysts from having to manually create standard weekly or monthly reports.

Now, the BI program must do more than reduce costs and streamline IT processes. It needs to become a mission-critical resource that executives view as essential to the company’s growth. This is especially true if BI represents a sizable portion of the IT budget and employs more than a handful of full-time staffers. With a six- or seven-figure budget, the BI program has a huge bull’s-eye on its back during budget season. To address the question, “What has BI done for us lately?” the BI team must have a ready and compelling answer.

Astute BI directors craft a vision for BI in business language and peddle it constantly. More than pitching the proverbial “single version of truth,” they discuss how BI is critical to understanding the profitability of every customer or every product and how this information drives critical decisions about how the company invests its time, resources, and money. They talk about how BI is critical to achieving a 360-degree view of customers or suppliers and how that information has increased revenues and decreased procurement costs, respectively.

Savvy BI directors also discuss how BI is bridging the barriers between departments. For example, they talk about how finance directors and operations managers can speak the same language because general ledger data is aligned with product sales at a detailed level. They also show how the company’s performance has improved since the BI team started displaying metrics with detailed, supporting data within easy-to-use dashboards and scorecards.

Talking the Talk

Of course, BI managers won’t have much to talk about if they haven’t done these or other strategic initiatives. But at least they can relay the vision they are working on, and bank on goodwill until they have tangible results to show.

But even “talking the talk” can be a stretch for some BI managers who grew up in IT with a technological perspective of the business. Thus, the first key to crossing the chasm is understanding the business and developing a strong rapport with business executives and managers by framing BI in their context.

Stay tuned for a description of the other major challenges facing companies in the Chasm in future posts.

Posted by Wayne Eckerson on October 19, 2009


Making Peace with Spreadsheet "Jockeys"

I just finished writing the second draft of a report titled “Transforming Finance: How CFOs Can Use Business Intelligence to Turn Finance from Bookkeepers to Strategic Advisors.” For the report, I interviewed several so-called “spreadsheet jockeys” who dominate the financial alcoves to find out why it is so hard to get them to use BI tools instead of spreadsheets where appropriate. (The report will be published in January 2010.)

I was lucky enough to interview a lady who used to be a financial analyst and is now in charge of business intelligence. Her perspective on why the BI team failed to meet her needs, and how she is trying to change the situation, is revealing.

“When I was in the planning and forecasting role within finance, the BI team would generate a report that gave me 90% of what I need. And while they felt they had addressed my needs, this was almost useless to me as I would need to rerun my manual queries to get the 10% of data that was missing. The BI team needs to understand the entire workflow. They need to ask why a financial analyst needs a report and what they are trying to accomplish.”

Shivani Thakur, director of e-payments and business intelligence at Points.com

I was also struck by Shivani’s rationale for using Excel rather than BI tools when she was in the finance department.

“I have been one of those so-called "spreadsheet jockeys" not out of an affinity for Excel but out of necessity. Finance is under tight time pressures to crunch the numbers and provide recommendations to the business. Major decisions are made on these recommendations (e.g. whether to take or reject business or do layoffs.) The finance team cannot afford to be wrong or wait for the data to be in the right format.”

“The reason that Excel is used is because this tool is the only place where you can download reports that meet some (not all) of your needs and then adjust and/or link (i.e. integrate) the numbers accordingly. I have also worked at organizations where a new BI tool provided data in a timely and appropriate format that financial analysts could slice and dice. However, the output had to be in Excel since again the tool did not provide all the data, and it was necessary to manually link to other sources of data.”

“Yet, spreadsheet jockeys struggle with version control, linking issues, and other errors when manual intervention is required and we end up with about 15 to 20 spreadsheets all linked together. This is not a desired state for anyone in finance. But currently other alternatives are not available.”

Now as a director of BI, Shivani is frustrated by her inability to meet the needs of financial analysts like her.

“One of the things I struggle with most is how long it now takes me to respond to business requests. As a "finance geek" I was able to slice and dice numbers in a number of days by downloading reports in Excel, linking them manually and doing whatever manipulations I needed. The BI team and environment that I have adopted is not as nimble. Integrating data into our systems and then ensuring the maintenance and quality of that data is taking months to complete. It is only after this that a simple report can be built.”

Shivani’s company is in the early stages of BI, and I’ve counseled her to keep plugging along until something happens that awakens the top executives to the importance of BI to the company and to the need to invest aggressively in this area. Usually, this is a merger or acquisition, a new executive or CIO, or new competition or regulations. Hopefully, this “change event” will happen soon, before her company loses a person who understands both sides of the business-IT equation and can truly make a difference.

Posted by Wayne Eckerson on October 14, 2009


Five Steps to Self Service Nirvana

There’s a lot of talk these days about the importance of self-service business intelligence (BI), but little discussion about the downsides. Unfortunately, I’ve seen self-service BI initiatives go completely awry.

Typically, a small percentage of power users employ the new-fangled tools to create tens of thousands of reports—many of which contain conflicting or inaccurate data—and this makes it harder for casual users to find the right report or trust its contents. Ironically, many turn to IT to create a custom report, which expands the report backlog that self-service BI was supposed to eliminate. In addition, many power users use the self-service tools to query large data sets that they export to Excel for analysis. These runaway queries erode query performance significantly, making the BI environment even less inviting for casual users.

So despite its allure, self-service BI can cause overall BI usage to plummet, report backlogs to grow, query performance to diminish, and information inconsistency to increase. To avoid such unintended consequences, here are five ways to implement self-service BI properly:

1. Deploy the right type of self-service BI. Power users need ad hoc report creation, while casual users need ad hoc report navigation. There is a big difference. Power users need to create reports from scratch, often accessing data from multiple locations. Casual users, on the other hand, simply want a report or dashboard that contains their top ten performance metrics and makes it easy to drill down into details, if desired.

2. Don’t abdicate responsibility for creating standard reports. To ensure that business units exploit the full value of the data warehousing resource, BI teams should take an active role in creating standard reports or dashboards for each group. This requires BI teams to get more engaged with the business, not less, which is usually what happens when an organization deploys self-service BI tools. It is not enough to create the data warehouse; if you want widespread adoption, you need to deliver the end-to-end solution.

3. Create a network of super users. Super users are technically-inclined power users in each department who function as extensions to the corporate BI team. Their job is to create ad hoc reports around the edges of the standard reports, suggest enhancements to the standard reports, and work with the BI Competency Center to define standards, select tools, and create a BI roadmap. Super users are the eyes and ears of the corporate BI team within each business group and critical to its success.

4. Liberate and manage power users. Power users can make or break a BI program. To get them on your side, give power users access to the best tools, unlimited access to data, and a sandbox within the data warehouse in which to conduct their ad hoc explorations. To ensure that they don’t proliferate spreadmarts, limit their ability to publish reports back to the server and establish a review board of power users (i.e., their peers) to examine new requests for published reports. This two-pronged approach empowers power users to generate critical insights while minimizing their ability to foment report chaos.

5. Educate executives to use standard, certified reports. For self-service BI to really work, executives need to make a companywide declaration and commitment to use standard, certified reports or dashboards when making decisions. Of course, there will always be exceptions, like when standard reports don’t contain the data needed to address issues required for a decision. But a corporate edict backed up by action makes all the difference.

Following these five steps will help your BI program achieve the holy grail of self-service BI.

Posted by Wayne Eckerson on September 28, 2009


Humpty Dumpty and the CEO

There is a dirty, little secret about data warehouses: we wouldn’t need them if top executives ran their organizations properly.

A data warehouse (DW) reflects the organization—the more fractured and disintegrated the organization, the harder it is to create a robust, highly functional data warehouse. These reporting repositories really are tools to reintegrate a fractured enterprise and provide a holistic and consistent view of data where none exists.

Most organizations are like Humpty Dumpty teetering and tottering on top of a big wall. With the slightest gust of wind, Humpty crashes and breaks into dozens of pieces. And DW teams are “all the king’s horses and all the king’s men” who are charged with putting Humpty Dumpty back together again.

Today, most companies have fragmented into dozens or hundreds of largely unconnected business units, departments, and workgroups, each with their own strategies, policies, processes, systems, IT staffs, and data. It’s the job of the CEO to bring order to this chaos. If the CEO provides the business with a clear, coherent strategy, integrated processes and systems, and standard definitions and rules governing those processes, then there is almost no need for a data warehouse.

The good news is that many DW teams do succeed in gluing their companies back together, at least for a while, until the next merger, acquisition, or other upheaval blasts everything apart. But while it lasts, these unsung heroes should be congratulated, not outsourced, offshored, or reassigned to the IT equivalent of a Siberian gulag.

The moral of the story is this: don’t blame the DW team if it can’t deliver a successful data warehouse; blame the CEO. The data warehouse is merely a messenger, a reflection of the state of organizational dysfunction.

Editor's note: this article originally appeared in the Washington Post.

Posted by Wayne Eckerson on September 23, 2009


From Data Warehousing to Performance Management

Next year marks the 15th anniversary of TDWI, an association of data warehousing and business intelligence (BI) professionals that has grown nearly as fast as the industry it serves, which now tops $9 billion according to Forrester Research.

In 1995, when data warehousing was just another emerging information technology, most people—including some at TDWI—thought it was just another tech fad that would fade away in a few short years like others before it (e.g., artificial intelligence, computer-assisted software engineering, object-relational databases). But data warehousing’s light never dimmed, and it has evolved rapidly to become an indispensable management tool in well-run businesses. (See Figure 1.)

Figure 1. As data warehousing has morphed into business intelligence and performance management, its purpose has evolved from historical reporting to self-service analysis to actionable insights. At each step along the way, its business value has increased.

In the beginning…

In the early days, data warehousing served a huge pent-up need within organizations for a single version of corporate truth for strategic and tactical decision making and planning. It also provided a much-needed, dedicated repository for reporting and analysis that wouldn’t interfere with core business systems, such as order entry and fulfillment.

What few people understood then was that data warehousing was the perfect, analytical complement to the transaction systems that dominated the business landscape. While transaction systems were great at “getting data in,” data warehouses were great at “getting data out” and doing something productive with it, like analyzing historical trends to optimize processes, monitoring and managing performance, and predicting the future. Transaction systems could not (and still can’t) do these vitally important tasks.

Phase Two: Business Intelligence

As it turns out, data warehousing was just the beginning. A data warehouse is a repository of data that is structured to conform to the way business users think and how they want to ask questions of data. (“I’d like to see sales by product, region and month for the past three years.”)

But without good tools to access, manipulate, and analyze data, users can’t derive much value from the data warehouse. So organizations began investing in easy-to-use reporting and analysis tools that would empower business users to ask questions of the data warehouse in business terms and get answers back right away without having to ask the IT department to create a custom report. Empowering business users to derive insight from data using self-service reporting and analysis tools became known as “business intelligence.”

Phase Three: Performance Management

Today, organizations value insights but they want results. Having an “intelligent” business (via business intelligence) doesn’t do much good if the insights don’t help it achieve its strategic objectives, such as growing revenues or increasing profits. In other words, organizations want to equip users with proactive information to optimize performance and achieve goals.

Accordingly, business intelligence is now morphing into performance management where organizations harness information to improve manageability, accountability, and productivity. The vehicles of choice here are dashboards and scorecards that graphically depict performance versus plan for companies, divisions, departments, workgroups and even individuals. In some cases, the graphical “key performance indicators” are updated hourly so employees have the most timely and accurate information with which to optimize the processes for which they are responsible.

Organizations have discovered that publishing performance among peer groups engenders friendly competition that turbocharges productivity. But more powerfully, these tools empower individuals and groups to work proactively to fix problems and exploit opportunities before it is too late. In this way, performance management is actionable business intelligence built on a single version of truth delivered by a data warehousing environment.

The Future

As Timbuk3 once sang, “The future’s so bright, I gotta wear shades.” The same could be said of business intelligence as it increasingly becomes a powerful tool for business executives to measure, monitor, and manage the health of their organizations and keep them on track towards achieving strategic goals.

Posted by Wayne Eckerson on September 18, 2009


Cognos Two Years Later

IBM Cognos laid out a roadmap for the future and provided a deep dive into the elements of its next major release (due in Q2 2010) at its fourth annual Industry Analyst Summit in Ottawa this week.

Due to confidentiality agreements, there is not much I can discuss publicly about future releases. However, it was refreshing to know that despite being swallowed up by IBM, the original Cognos executive team is still intact, and they seem as engaged and excited as ever. The next release contains many breakthrough features that will make the IBM Cognos offerings extremely compelling.

IBM executives have made a sizable strategic commitment to business analytics and optimization as a major growth area for the $100 billion company. As a consequence, IBM Cognos is playing a pivotal role at IBM, and some of its executives have been tapped to lead the charge corporatewide. To align with the strategy, many IBM business units have been eager to partner with the Cognos team, making it like the proverbial kid in a candy store, able to pick and choose its best go-to-market opportunities. For example, IBM Cognos played a key part in IBM’s recent SmartAnalytics announcement. (See “IBM Rediscovers Itself.”)

With IBM opening many new channels and markets and expanding Cognos’ development and marketing resources, IBM Cognos is experiencing a financial uplift. Rob Ashe, IBM’s general manager of business intelligence and performance management, announced by video that IBM Cognos has grown license revenues by 30% this past year.

IBM Cognos Express

While at the event, IBM Cognos announced its new mid-market product, IBM Cognos Express. Although it has lagged behind competitors in delivering a product in this space, IBM Cognos may end up the winner. Instead of just repackaging its enterprise product with a different license model, IBM Cognos thought long and hard about what mid-market customers need and created an entirely new end-to-end product designed to suit those requirements. IBM Cognos Express borrows IBM and Cognos technologies to deliver a product that is easy to buy, simple to install, easy to use, and easy to maintain.

IBM Cognos Express offers integrated modules for query, reporting, analysis, visualization, dashboarding, and planning. Once installed, the product essentially creates a local data mart (via TM1) behind the scenes. The product supports a maximum of 100 users running on a single server. And here’s the one downside: if users outgrow the IBM Cognos Express solution, they can either purchase another Express server or buy and install IBM Cognos 8 BI and migrate their data, models, and reports.

Pricing starts at $12,500, with fees for users, administrators, and data connectors. What’s most interesting is that prospects can download and try the product for 30 days and even finance the purchase from IBM Global Financing. On the surface, this product should resonate well with the mid-market. It’s good to see a company that tries to understand its customers before shoving products down their throats!

Posted by Wayne Eckerson on September 17, 2009