Dashboard Ho!
How a new breed of dashboards and scorecards will help usher in the convergence of BI and PM.
- By Stephen Swoyer
- October 4, 2006
To listen to some business intelligence (BI) vendors tell it, the success of dashboards and scorecards is all but assured. But TDWI director of research and services Wayne Eckerson says dashboards and scorecards will soon have much more than cachet among both executives and BI pros, partly because a new breed of such tools—the performance dashboard and the performance scorecard—will help usher in the convergence of BI and performance management (PM).
“Like long-lost siblings, these two disciplines have struggled on their own to deliver real business value and gain a permanent foothold within organizations,” Eckerson writes in a TDWI report published this summer, Deploying Dashboards and Scorecards. “But together they offer an unbeatable combination whose whole is greater than the sum of the parts.”
Eckerson’s conclusions are based on a survey of 299 dashboard and 199 scorecard adopters, 45 percent of which represent companies with annual revenues of $500 million or more.
According to Eckerson, nearly one-third (31 percent) of survey respondents are using dashboards or scorecards as their primary analytic applications. Moreover, nearly one-quarter (24 percent) are in the process of building performance dashboards or scorecards, while nearly three-quarters (74 percent) have either deployed dashboards or scorecards or are in the process of doing so. The department most likely to deploy a dashboard or a scorecard is operations, where nearly 70 percent of survey respondents have deployed dashboards and almost 80 percent have deployed scorecards. Beyond operations, popular destinations for dashboards or scorecards include finance, sales, and marketing.
If there’s confusion about the difference between dashboards and scorecards, that isn’t surprising. Hyperion Solutions Corp., for example, just announced a Webinar that proposes to explain the differences between the two, and the benefits of each, to prospective buyers. The salient takeaway, Eckerson writes, is that dashboards and scorecards are often deployed in tandem. He cites the example of a government agency that purchased dual-use dashboard and scorecard software several years ago. The agency now takes advantage of both PM tools, supporting dashboard and scorecard views for 1,500 employees, and maintains more than 60 scorecards as well as numerous dashboards, Eckerson says.
That particular deployment is by no means anomalous. More than one-third (34 percent) of dashboard adopters expose scorecard capabilities in the same application, while an even larger share, fully 50 percent, do the opposite (i.e., expose dashboard capabilities in what are nominally scorecard deployments).
This is only part of the story, however: among dashboard adopters, for example, more than half (51 percent) expose scorecarding capabilities in separate applications, while nearly three-quarters (73 percent) of scorecard adopters take the same approach with dashboards. Many scorecard adopters configure these tools to provide generic, at-a-glance insight into operational health. Dashboard adopters, on the other hand, usually expose some form of deeper analytic capability and give users more of an opportunity to customize their views.
There’s a sense in which dashboards are viewed as must-have executive accessories, and as Eckerson notes, there’s a reason for this: many dashboard deployments are the result of top-down prodding from executive decision-makers. In nearly two-thirds (64 percent) of cases, for example, survey respondents reported that the drive to deploy dashboards came from the line of business.
Of course, just because a PM push comes from the line of business doesn’t mean that IT has carte blanche, funding-wise at least, to purchase and deploy a dashboard or scorecard (or both).
“Program managers usually don’t get much money to start, but find a way to beg, borrow, or steal development and hardware resources to deliver an initial solution in two to three months,” writes Eckerson. “Until recently, most solutions were homegrown, but the spate of inexpensive dashboard and scorecard projects [with total license fees of $50,000 or less] will change that.”
In other words, Eckerson says, line-of-business customers usually won’t put up any cash until they’re able to see a real, promising working model. “Once the solution looks promising, the executive usually negotiates funds to turn the shoestring application into a more permanent solution by allocating full-time development staff to the project and purchasing hardware and software,” he notes. While this is perhaps understandable, since line-of-business customers are notoriously stingy with their budget dollars, it’s also to some extent self-defeating: “[E]ventually these quickie projects must be re-architected and put on a more substantial data infrastructure, often at significant cost.”
So what kinds of user constituencies are performance dashboards and scorecards serving? From tiny to vast, and everything in between. The average dashboard supports 315 users; the average scorecard, 493. Once again, however, this is far from a complete picture: after all, 72 percent of dashboards and scorecards are deployed to fewer than 100 users. “This means that while most performance dashboards support small user bases, a minority of implementations support extremely large numbers of users,” Eckerson points out.
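The arithmetic behind that observation is worth spelling out. Here is a quick sketch with entirely hypothetical deployment sizes (none of these figures come from the TDWI survey) showing how a few very large deployments can pull the average up to roughly 315 users even when most deployments serve fewer than 100:

```python
from statistics import mean, median

# Hypothetical user counts for ten deployments (illustrative only):
# seven serve fewer than 100 users; three are very large.
deployments = [40, 50, 60, 70, 80, 90, 95, 400, 900, 1365]

small = sum(1 for users in deployments if users < 100)

print(f"Average users per deployment: {mean(deployments):.0f}")  # 315
print(f"Median users per deployment: {median(deployments):.0f}")  # 85
print(f"Share of deployments under 100 users: {small / len(deployments):.0%}")  # 70%
```

The average lands far above what a typical deployment actually looks like, which is exactly the skew Eckerson describes.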
There’s a further wrinkle here, too, he indicates. “There is usually a direct correlation between years of deployment and number of users. That’s because once one department has a dashboard, executives in every other department want one. Performance dashboards are very contagious.”
Once a company decides to take the performance dashboard plunge, it typically moves fast, Eckerson says. More than one-third (37 percent) of all performance dashboard implementations are deployed in weeks, while the rest take months or longer. “Quick deployments can happen when there is already a consensus on what measures will go into the performance dashboard. Most industries and organizations already have a set of standard measures, which they reuse in a performance dashboard deployment.”
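To make the idea of reusable standard measures concrete, here is a minimal sketch of what such a catalog might look like in code. Everything in it, from the measure names to the targets and thresholds, is invented for illustration; the report doesn’t prescribe any particular format:

```python
from dataclasses import dataclass

@dataclass
class Measure:
    """One standard measure (KPI) that can be reused across deployments."""
    name: str
    unit: str
    target: float
    warning: float  # at or above this (but below target) reads as "yellow"

    def status(self, value: float) -> str:
        if value >= self.target:
            return "green"
        return "yellow" if value >= self.warning else "red"

# Hypothetical industry-standard measures; reusing a catalog like this,
# rather than debating metrics from scratch, is what makes quick
# deployments possible.
STANDARD_MEASURES = [
    Measure("On-time delivery", "%", target=95.0, warning=90.0),
    Measure("Order fill rate", "%", target=98.0, warning=95.0),
]

for m in STANDARD_MEASURES:
    print(m.name, m.status(93.0))  # -> yellow, then red
```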
Quick deployments are also made possible by existing—and, moreover, mature—BI and data management practices.
Most vendors talk about pervasive BI and the potential for connecting dashboards and scorecards to a wide array (perhaps the entirety) of an organization’s data sources. In practice, however, this is rarely the case. Not surprisingly, the vast majority of dashboards and scorecards are at least connected to relational databases (approaching 90 percent, in both cases). Dashboards and scorecards are also frequently connected to mainframe data sources (with connectivity in both cases approaching 50 percent).
Ditto for Excel and, to a lesser extent, packaged applications. In the latter case, not quite 40 percent of dashboards (and just under half of scorecards) consume data from packaged sources. Surprisingly, connectivity to reports is less frequent: fewer than one-third of dashboards (and less than 40 percent of scorecards) are exposed to report data.
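Because relational databases are by far the most common source, the basic integration pattern is worth sketching. What follows is purely illustrative (it isn’t any vendor’s actual API): it uses Python’s built-in sqlite3 module and an invented table to show a dashboard tile being fed directly from a relational query:

```python
import sqlite3

# Hypothetical in-memory database standing in for an operational system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("East", 120000.0), ("West", 95000.0), ("East", 80000.0)],
)

# A dashboard tile is often just an aggregate query plus a display rule.
rows = conn.execute(
    "SELECT region, SUM(revenue) FROM orders GROUP BY region ORDER BY region"
).fetchall()

for region, revenue in rows:
    print(f"{region}: ${revenue:,.0f}")  # East: $200,000 / West: $95,000

conn.close()
```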
About the Author
Stephen Swoyer is a technology writer with 20 years of experience. His writing has focused on business intelligence, data warehousing, and analytics for almost 15 years. Swoyer has an abiding interest in tech, but he’s particularly intrigued by the thorny people and process problems technology vendors never, ever want to talk about. You can contact him at [email protected].