TDWI Blog

Wayne Eckerson

Reflections on the practice of business intelligence.

by Wayne Eckerson. Eckerson is the author of many in-depth reports, the best-selling book Performance Dashboards: Measuring, Monitoring, and Managing Your Business (John Wiley & Sons, 2005), and TDWI’s BI Maturity Model, as well as a columnist for several business and technology magazines and a noted speaker and blogger.


Get Help When Designing Dashboards

Designing dashboards is not unlike decorating a room in your house. Most homeowners design as they purchase objects to place in the room. When we buy a rug, we select the nicest rug; when we pick out wall paint, we pick the most appealing color; when we select chairs and tables, we find the most elegant ones we can afford. Although each individual selection makes sense, collectively the objects clash or compete for attention.

Smart homeowners (with enough cash) hire interior decorators who filter their tastes and preferences through principles of interior design to create a look and feel in which every element works together harmoniously and emphasizes what really matters. For example, the design might highlight an elegant antique coffee table by selecting carpets, couches, and curtains that complement its color and texture.

To optimize the design of your performance dashboard, it is important to get somebody on the team who is trained in the visual design of quantitative information displays. Although few teams can afford to hire someone full time, you may be able to hire a consultant to provide initial guidance or find someone in the marketing department with appropriate training. Ideally, the person can educate the team about basic design principles and provide feedback on initial displays.

But be careful: don’t entrust the design to a run-of-the-mill graphic artist or to anyone unfamiliar with user requirements, business processes, and corporate data. For example, a Web designer will give you a professional-looking display but will probably garble the data: using the wrong type of chart, grouping metrics in nonsensical ways, or applying the wrong filters for different user roles. Any designer needs to take the time upfront to understand user requirements and the nature of the data that will populate the displays.

Ideally, report developers and design experts work together to create an effective series of dashboard displays, complementing each other’s knowledge and expertise. This partnership can serve as a professional bulwark against the sometimes misguided wishes of business users. Although it’s important to listen to and incorporate user preferences, ultimately the look and feel of a dashboard should remain in the hands of design professionals. For example, most companies today entrust the design of their Web sites and marketing collateral to professional media designers who work in concert with members of the marketing team. They don’t let the CEO dictate the Web design (or shouldn’t, anyway!).

There are many good books available today to help dashboard teams bone up on visual design techniques. The upcoming second edition of my book, “Performance Dashboards: Measuring, Monitoring, and Managing Your Business,” has a chapter devoted to visual design techniques. Many of the concepts in that chapter are inspired by Stephen Few, whose “Information Dashboard Design” is a must-read. Few and others have drawn inspiration from Edward R. Tufte, whose book “The Visual Display of Quantitative Information” is considered a classic in the field. Tufte has also written “Visual Explanations,” “Envisioning Information,” and “Beautiful Evidence.”

Posted by Wayne Eckerson on July 27, 2010


Do Your Team a Favor: Stop Acting Like IT

(Caution: This blog may contain ideas that are hazardous to your career.)

I’ve argued in previous blogs that business intelligence (BI) professionals must think more like business people and less like IT managers if they are to succeed. However, while many BI professionals have their hearts in the right place, their actions speak differently. They know what they need to do but can’t seem to extricate themselves from an IT mindset. That takes revolutionary thinking and a little bit of luck.

Radical Thinking

So, here’s a radical idea that will help you escape the cultural bonds of IT: don’t upgrade your BI software.

Now, if you gasped after reading that statement, you’re still an IT person at heart. An IT person always believes in the value of new software features and fears losing vendor support and leverage by not staying fairly current with software licenses and versions.

Conversely, the average business person sees upgrades as a waste of time and money. Most don’t care about the new functionality or appreciate the financial rationale or architectural implications. To them, the upgrade is just more “IT busywork.”

Here’s another radical idea: stick with the BI tools you have. Why spend a lot of money and time migrating to a new platform when the one you have works? So what if the tools are substandard and missing features? Is it really a problem if the tools force your team to work overtime to make ends meet? Who are the tools really designed to support: you or the users?

In the end, it’s not the tools that matter, it’s how you apply them. Case in point: in high school, I played clarinet in the band. One day, I complained vociferously to the first chair, a geeky guy named Igor Kavinsky who had an expensive, wooden clarinet (which I coveted), that my cheap, plasticized version wasn’t working very well and I needed a replacement. Before I could list my specific complaints, he grabbed my clarinet, replaced the mouthpiece, and began playing.

Lo and behold, the sound that came from my clarinet was beautiful, like nothing I had ever produced! I was both flabbergasted and humiliated. It was then I realized that the problem was not the instrument but me! Igor showed me that it’s the skill of the practitioner, not the technology, that makes all the difference.

Reality Creeps In

Igor notwithstanding, if you’re a good, well-trained IT person, you probably think my prior suggestions are unrealistic, if not ludicrous. In the “real world,” you say, there is no alternative to upgrading and migrating software from time to time. These changes—although painful—improve your team’s ability to respond quickly to new business needs and avoid a maintenance nightmare. And besides, many users want the new features and tools, you insist.

And of course, you are right. You have no choice.

Yet, given the rate of technology obsolescence and vendor consolidation, your team probably spends 50% of its time upgrading and migrating software. And it spends its remaining time maintaining both new and old versions (because everyone knows that old applications never die.) All this busywork leaves your team with precious little time and resources to devise new ways to add real value to the business.

Am I wrong? Is this a good use of your organization’s precious capital? What would a business person think about the ratio of maintenance to development dollars in your budget?

Blame the Vendors. It’s easy to blame software vendors for this predicament. In their quest for perpetual growth and profits, vendors continually sunset existing products, leaving you (the hapless customer) with no choice but to upgrade or lose support and critical features. And just when you’ve fully installed their products, they merge with another company and reinvent their product line, forcing another painful migration. It’s tempting to think that these mergers and acquisitions are simply diabolical schemes by vendors to sell customers expensive replacement products. Just ask any SAP BI customer!

Breaking the Cycle

If this describes your situation, what do you do about it? How do you stop thinking like an IT person and being an IT cuckold to software vendors?

Most BI professionals are burrowed more deeply in an IT culture than they know. Breaking free often requires a cataclysmic event that rattles their cages and creates an opening to escape. This might be a change in leadership, deregulation, a new competitor, or a new computing platform. Savvy BI managers seize such opportunities to reinvent themselves and change the rules of the game.

Clouds Coming. Lucky for you, the Cloud—or more specifically, Software as a Service (SaaS)—is one of those cataclysmic events. The Cloud has the potential to liberate you and your team from an overwrought IT culture that is mired in endless, expensive upgrades and painful product migrations, among other things.

The beauty of a multi-tenant, cloud-based solution is that you never have to upgrade software again. In a SaaS environment, the upgrades happen automatically. To business and IT people, this is magical: cool new features appear and no one has to do any work or suffer any inconvenience. SaaS also eliminates vendor lock-in since you can easily change cloud vendors (as long as you maintain your data) by simply pointing users to a new URL. The Cloud is a radical invention that promises to alter IT culture forever.

Getting Started. To break the cycle, start experimenting with cloud-based BI solutions. Learn how these tools work and who offers them. Use the cloud for prototypes or small, new projects. Some cloud BI vendors offer a 30-day free trial while more scalable solutions promise to get you up and running quickly. If you have a sizable data warehouse, leave your data on premise and simply point the cloud BI tools to it. Performance won’t suffer.

Unless you experiment with ways to break free from an IT culture, you never will. Seize the opportunity that the Cloud affords and others that are sure to follow. Carpe diem!

Posted by Wayne Eckerson on June 25, 2010


The Evolution of Data Federation

Data federation is not a new technique. The notion of virtualizing multiple back-end data sources has been around for a long time, reemerging every decade or so with a new name and mission.

Database Gateways. In the 1980s, database vendors introduced database gateways that could transparently query multiple databases on the fly, making it easier for application developers to build transactional applications in a heterogeneous database environment. Oracle and IBM still sell these types of gateways.

VDW. In the 1990s, vendors applied data federation to the nascent field of data warehousing, touting its ability to create “virtual” data warehouses. However, data warehousing purists labeled the VDW technology as “voodoo and witchcraft” and it never caught on, largely because standardizing disparate data from legacy systems was nearly impossible to do without creating a dedicated data store.

EII. By the early 2000s, with more powerful computing resources, data federation was positioned as a general-purpose data integration tool, adopting the moniker “enterprise information integration,” or EII. The three-letter acronym was designed to mirror ETL—extract, transform, and load—which had become the predominant method for integrating data in data warehouses.

Data Services. In addition, the rise of Web services and service-oriented architectures during the past decade gave data federation another opportunity. It got positioned as a data service, abstracting back-end data sources behind a single query interface. It is now being adopted by many companies that are implementing service-oriented architectures.

Data Virtualization. Today, data federation vendors prefer the label of data virtualization, capitalizing on the popularity of hardware virtualization in corporate data centers and the cloud. The term data virtualization reinforces the idea that data federation tools abstract databases and data systems behind a common interface.

Data Integration Toolbox

Over the years, data federation has gained a solid foothold as an important tool in any data integration toolbox. Companies use the technology in a variety of situations that require unified access to data in multiple systems via high-performance distributed queries. This includes data warehousing, reporting, dashboards, mashups, portals, master data management, service-oriented architectures, post-acquisition systems integration, and cloud computing.

One of the most common uses of data federation is augmenting data warehouses with current data from operational systems. In other words, data federation enables companies to “real-time-enable” their data warehouses without rearchitecting them. Another common use case is to support “emergency” applications that need to be deployed quickly and where the organization doesn’t have the time or money to build a data warehouse or data mart. Finally, data federation is often used to create operational reports that require data from multiple systems.
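To make that first use case concrete, here is a minimal Python sketch (using pandas and SQLAlchemy, with made-up connection strings, table names, and columns) of what “real-time-enabling” a warehouse means logically: historical rows come from the warehouse, today’s rows come from the operational system, and the two are unioned at query time. A federation tool performs the same trick behind a single SQL interface, without this hand-written plumbing.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connections: the historical warehouse and a live operational database
warehouse = create_engine("postgresql://user:pass@dw-host/warehouse")
operational = create_engine("postgresql://user:pass@oltp-host/orders")

# Historical sales already loaded into the warehouse (everything before today)
history = pd.read_sql(
    "SELECT order_date, region, SUM(amount) AS sales "
    "FROM fact_orders WHERE order_date < CURRENT_DATE "
    "GROUP BY order_date, region",
    warehouse)

# Today's sales, pulled on the fly from the operational system
today = pd.read_sql(
    "SELECT CAST(order_ts AS DATE) AS order_date, region, SUM(amount) AS sales "
    "FROM orders WHERE order_ts >= CURRENT_DATE "
    "GROUP BY CAST(order_ts AS DATE), region",
    operational)

# The "virtual" unified view a dashboard or report would query
unified_sales = pd.concat([history, today], ignore_index=True)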

A decade ago there were several pure-play data federation vendors, but now the only independent is Composite Software, which is OEM’d by several BI vendors, including IBM Cognos. Other BI vendors support data federation natively, including Oracle (OBIEE) and MicroStrategy. And many data integration vendors, including Informatica and SAP, have added data federation to their data integration portfolios.

Federation versus Integration

Traditionally, the pros and cons of data federation are weighed against those of data integration toolsets, especially when creating data warehouses. The question has always been “Is it better to build virtual data warehouses with federation tools or physical data marts and data warehouses with ETL tools?”

Data federation offers many advantages -- it’s a fast, flexible, low-cost way to integrate diverse data sets in real time. But data integration offers benefits that data federation doesn’t: scalability, complex transformations, and data quality and cleansing.

But what if you could combine the best of these two worlds and deliver a data integration platform that offered data federation as an integrated module, not a bolt on product? What if you could get all the advantages of both data federation and data integration in a single toolset?

If you could have your cake and eat it, too, you might be able to apply ETL and data quality transformations to real-time data obtained through federation tools. You wouldn’t have to create two separate semantic models, one for data federation and another for ETL; you could use one model to represent both modalities. Basically, you would have one tool instead of two. This would make it easier, quicker, and cheaper to apply both data federation and data integration capabilities to any data challenge you might encounter.

This seamless combination is the goal of some data integration vendors. I recently did a Webcast with Informatica, which shipped a native data federation capability this year that runs on the same platform as its ETL tools. This is certainly a step forward for data integration teams that want a single, multipurpose environment instead of multiple, independent tools, each with their own architecture, metadata, semantic model, and functionality.

Posted by Wayne Eckerson on June 15, 2010


The Key to Analytics: Ask the Right Questions

People think analytics is about getting the right answers. In truth, it’s about asking the right questions.

Analysts can find the answer to just about any question. So, the difference between a good analyst and a mediocre one is the questions they choose to ask. The best questions test long-held assumptions about what makes the business tick. The answers to these questions drive concrete changes to processes, resulting in lower costs, higher revenue, or better customer service.

Often, the obvious metrics don’t correlate with sought-after results, so it’s a waste of time focusing on them, says Ken Rudin, general manager of analytics at Zynga and a keynote speaker at TDWI’s upcoming BI Executive Summit in San Diego on August 16-18.

Challenge Assumptions

For instance, many companies evaluate the effectiveness of their Web sites by calculating the number of page hits. Although a standard Web metric, total page hits often don’t correlate with higher profits, revenues, registrations, or other business objectives. So, it’s important to dig deeper and challenge assumptions rather than take them at face value. For example, a better Web metric might be the number of hits that come from referral sites (versus search engines), time spent on the Web site, or time spent on specific pages.
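As a rough illustration, here is how those deeper metrics might be computed from a visit-level clickstream table, in contrast to simply summing page hits. The data and column names below are invented for the example.

import pandas as pd

visits = pd.DataFrame({
    "visit_id":        [1, 2, 3, 4, 5],
    "source":          ["referral", "search", "referral", "direct", "search"],
    "page_hits":       [12, 3, 8, 2, 5],
    "seconds_on_site": [340, 45, 510, 30, 60],
})

total_hits     = visits["page_hits"].sum()                # the "standard" metric
referral_share = (visits["source"] == "referral").mean()  # share of visits arriving via referrals
avg_time       = visits["seconds_on_site"].mean()         # engagement: average time on site

print(total_hits, f"{referral_share:.0%}", f"{avg_time:.0f} sec")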

TDWI Example. Here’s another example closer to home. TDWI always mails conference brochures 12 weeks before an event. Why? No one really knows; that’s how it’s always been done. Ideally, we should conduct periodic experiments. Before one event, we should send a small set of brochures 11 weeks beforehand and another small set 13 weeks prior. And while we’re at it, we should test the impact of direct mail versus electronic delivery on response rates.
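A back-of-the-envelope sketch of that experiment is below: mail two small batches at 11 and 13 weeks out and compare response rates with a simple two-proportion z-test. The counts are invented for illustration.

from math import sqrt
from statistics import NormalDist

sent_11, responded_11 = 2000, 64   # hypothetical batch mailed 11 weeks before the event
sent_13, responded_13 = 2000, 42   # hypothetical batch mailed 13 weeks before the event

p1, p2 = responded_11 / sent_11, responded_13 / sent_13
p_pool = (responded_11 + responded_13) / (sent_11 + sent_13)
se = sqrt(p_pool * (1 - p_pool) * (1 / sent_11 + 1 / sent_13))
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided test

print(f"11-week rate {p1:.1%}, 13-week rate {p2:.1%}, z = {z:.2f}, p = {p_value:.3f}")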

These types of analyses don’t require sophisticated mathematical software or expensive analysts, just time, effort, and a willingness to challenge long-held assumptions. And the results are always worth the effort; they can either validate or radically alter the way we think our business operates. Either way, the information impels us to fine-tune or restructure core business processes in ways that can lead to better bottom-line results.

Analysts are typically bright people with strong statistical skills who are good at crunching numbers. Yet, the real intelligence required for analytics is a strong dose of common sense combined with a fearlessness to challenge long-held assumptions. “The key to analytics is not getting the right answers,” says Rudin. “It’s asking the right questions.”

Posted by Wayne Eckerson on June 10, 2010


How to Organize Business Analysts

Business analysts are a key resource for creating an agile organization. These MBA- or PhD-credentialed number crunchers can quickly unearth insights and correlations so executives can make critical decisions. Yet, one decision that executives haven’t analyzed thoroughly is the best way to organize business analysts to enhance their productivity and value.

Distributed Versus Centralized

Traditionally, executives either manage business analysts as a centralized, shared service or allow each business unit or department to hire and manage its own business analysts. Ultimately, neither a centralized nor a distributed approach is optimal.

Distributed Approach. In a distributed approach, a department or business unit head hires the analyst to address local needs and issues. This is ideal for the business head and departmental managers who get immediate and direct access to an analyst. And the presence of one or more analysts helps foster a culture of fact-based decision making. For example, analysts will often suggest analytical methods for testing various ideas, helping managers become accustomed to basing decisions on fact rather than gut feel alone.

However, in the distributed approach, business analysts often become a surrogate data mart for the department. They get bogged down creating low-value, ad hoc reports instead of conducting more strategic analyses. If the business analyst is highly efficient, the department head often doesn’t see the need to invest in a legitimate, enterprise decision-making infrastructure. Analysts, for their part, often feel pigeonholed in a distributed approach. They see little room for career advancement and few opportunities to expand their knowledge into new areas. They also feel isolated, with few chances to exchange ideas and collaborate on projects with fellow analysts. In effect, they are “buried” in departmental silos.

Centralized Approach. In a centralized approach, business analysts are housed centrally and managed as a shared service under the control of a director of analytics or, more likely, a chief financial officer or director of information management. One benefit of this approach is that organizations can assign analysts to strategic, high-priority projects rather than tactical, departmental ones, and the director can establish a strong partnership with the data warehousing and IT teams, which control access to data, the fuel for business analysts. Also, by being co-located, business analysts can more easily collaborate on projects, mentor new hires, and cross-train in new disciplines. This makes the environment a more rewarding place to work for business analysts and increases their retention rate.

The downside of the centralized approach is that business analysts are a step removed from the people, processes, and data that drive the business. Without firsthand knowledge of these things, business analysts are less effective. It takes them longer to get up to speed on key issues and deliver useful insights, and they may miss various nuances that are critical for delivering a proper assessment. In short, without a close working relationship with the people they support and intimate knowledge of local processes and systems, they are running blind.

Hybrid Approach

A more optimal approach combines elements of both distributed and centralized methods. In a hybrid environment, business analysts are embedded in departments or business units but report directly to a director of analytics. This sounds easy enough, but it’s hard to do. It works best when the company is geographically consolidated in one place, so members of the analytics team can easily reconvene in person to share ideas and discuss issues.

Zynga. For example, Zynga, an internet gaming company, uses a hybrid approach for managing its analysts. All of Zynga’s business analysts report to Ken Rudin, director of analytics for the company. However, about 75% of the analysts are embedded in business units, working side by side with product managers to enhance games and retain customers. The remainder sit with Rudin and work on strategic, cross-functional projects. (See “Proactive Analytics That Drive the Business” in my blog for more information on Zynga’s analytics initiative.) This setup helps deliver the best of both centralized and distributed approaches.

Every day, both distributed and centralized analysts come together for a quick “stand up” meeting where they share ideas and discuss issues. This helps preserve the sense of team and fosters a healthy exchange of knowledge among all the analysts, both embedded and centralized. Although Zynga’s analysts all reside on the same physical campus, a geographically distributed team could simulate “stand up” meetings with virtual Web meetings or conference calls.

Center of Excellence. The book “Analytics at Work” by Tom Davenport, Jeanne Harris, and Robert Morison describes five approaches for organizing analysts, most of which are variations on the themes described above. One approach, “Center of Excellence” is similar to the Hybrid approach above. The differences are that all (not just some) business analysts are embedded in business units, and all are members of (and perhaps report dotted line to) a corporate center of excellence for analytics. Here, the Center of Excellence functions more like a program office that coordinates activities of dispersed analysts rather than a singular, cohesive team, as in the case of Zynga.

Either approach works, although Zynga’s makes it easier for an inspired director of analytics to shape and grow the analytics department quickly and foster a culture of analytics throughout the organization.

Summary. As an organization recognizes the value of analytics, it will evolve the way it organizes its business analysts. Typically, companies will start off on one extreme—either centralized or distributed—and then migrate to a more nuanced hybrid approach in which analysts report directly to a director of analytics (i.e., Zynga) or are part of a corporate Center of Excellence.

Posted by Wayne Eckerson on May 23, 2010


Attracting and Retaining Top BI Professionals

To create high-performance BI teams, we need to attract the right people. There are a couple of ways to do this.

Skills Versus Qualities

Inner Drive. First, don’t just hire people to fill technical slots. Yes, you should demand a certain level of technical competence. For example, everyone on the award-winning BI team at Continental Airlines in 2004 had training as a database administrator. But these days, technical competence is simply a ticket to play the game. To win the game, you need people who are eager to learn, highly adaptable, and passionate about what they do.

The bottom line is that you shouldn’t hire technical specialists whose skills may become obsolete tomorrow if your environment changes. Hire people who have inner drive and can reinvent themselves on a regular basis to meet the future challenges your team will face. If you need pure technical specialists, consider outsourcing or contracting people to fill these roles.

Think Big. To attract the right people, it’s important to set ambitious goals. A big vision and stretch targets will attract ambitious people who seek new challenges and opportunities and discourage risk-averse folks who simply want a “job” and a company to “take care” of them. One way to think big is to run the BI group like a business. Create mission, vision, and values statements for your team and make sure they align with the strategic objectives of your organization. Put people in leadership positions, delegate decision making, and hold them accountable for results. Reward them handsomely for success (monetarily or otherwise) yet don’t punish failure. View mistakes as “coaching opportunities” to improve future performance or as evidence that the team needs to invest more resources into the process or initiative.

Performance-based Hiring

Proactive Job Descriptions. We spend a lot of time measuring performance after we hire people, but we need to inject performance measures into the hiring process itself. To do this, write proactive job descriptions that contain a mission statement, a series of measurable outcomes, and the requisite skills and experience needed to achieve the outcomes. Make sure the outcomes are concrete and tangible, such as “improve the accuracy of the top dozen data elements to 99%,” “achieve a rating of ‘high’ or better from 75% of users in the annual BI customer satisfaction survey,” or “successfully deliver 12 major projects a year on time and on budget.”

If done right, a proactive job description helps prospective team members know exactly what they are getting into. They know specific goals they have to achieve and when they have to achieve them. A proactive job description helps them evaluate honestly whether they have what it takes to do the job. It also becomes the new hire’s performance review and a basis for merit pay. For more information on writing these proactive job descriptions, I recommend reading the book, “Who: A Method for Hiring” by Geoff Smart and Randy Street.

Where are they? So where do you find these self-actuated people? Steve Dine, president of DataSource Consulting and a TDWI faculty member, says he tracks people on Twitter and online forums, such as TDWI’s LinkedIn group. He evaluates them by identifying the number of followers they have and assessing the quality of advice they offer. And he avoids the major job boards, such as Monster.com or Dice.com. A more direct and surefire method is to hire someone on a contract basis to see what kind of work they do and how they get along with others on your team.

Retaining the Right People

Finally, to retain your high-performance team, you need to understand what makes BI professionals tick. Salary is always a key factor, but not the most important one. (See TDWI’s annual Salary, Roles, and Responsibility Report.) Above all else, BI professionals want new challenges and opportunities to expand their knowledge and skills. Most get bored if forced to specialize in one area for too long. It’s best to provide BI professionals with ample opportunity to attend training classes so they can pick up new skills.

One way to retain valuable team members is to create small teams responsible for delivering complete solutions. This gives team members exposure to all technologies and skills needed to meet business needs and also gives them ample face time with the business folks who use the solution. BI professionals are more motivated when they understand how their activities contribute to the organization’s overall success. Another retention technique is to give people opportunities to exercise their leadership skills. For instance, assign your rising stars to lead small, multidisciplinary teams where they define the strategy, execute the plans, and report their progress to the team as a whole.

Summary. By following these simple techniques, you will attract the right people to serve on your team and give them ample opportunity to stay with you. For more information on creating high performance BI teams, see a recent entry at my Wayne’s World blog titled “Strategies for Creating High Performance Teams” and the slide deck by the same title at my LinkedIn Page.

Posted by Wayne Eckerson on May 20, 2010


C'mon Dashboard Vendors: Time to Step Up!

I’m perplexed why some BI vendors treat the metrics in their dashboards differently than the metrics in their scorecards. When you look at their scorecard products, all the metrics have targets associated with them and display color-coded traffic lights and trending symbols. But when you examine their dashboards, the metrics are simply charts and tables without performance context. These vendors act as if dashboards are simply a collection of charts and tables—a metrics portal, if you will—not a bona fide performance management system.

To me, if you insert a metric in a dashboard or a scorecard, then it’s worthy enough to be measured against a goal and displayed in all its color-coded glory. Otherwise, why are you measuring the activity at all? If no one cares whether the activity is performed well or not—or whether it’s aligned with strategic, tactical, or operational objectives—what’s the point? You are simply cluttering the dashboard with useless data.

To test my conjecture, I asked the audience at the Palladium Performance Management Conference during a panel session today whether the metrics in their dashboards have goals associated with them (or should anyway). The resounding response was, “Yes!” It turns out that user organizations don’t make the same arbitrary distinctions between scorecards and dashboards that vendors do.

Of course, there was some debate among my fellow panelists about this issue. Sure, there are some metrics, like top ten lists, that are purely diagnostic yet important to include in a dashboard. And some operational users may internalize metric goals because they are so immersed in the processes they manage and the organization doesn’t feel the need to instrument those metrics. Yet, the fundamental principle holds true: every metric in a scorecard AND dashboard should have an associated target.
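To make the principle tangible, here is a minimal Python sketch (the thresholds, names, and fields are illustrative only, not any vendor’s API) of what it means for a dashboard metric to carry a business-defined target and derive its traffic-light status from it:

from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    actual: float
    target: float   # set by the business owner, not hard-coded by IT

    def status(self, warn: float = 0.9) -> str:
        """Green at or above target, yellow within 90% of target, red below that."""
        if self.actual >= self.target:
            return "green"
        if self.actual >= warn * self.target:
            return "yellow"
        return "red"

on_time_delivery = Metric("On-time delivery %", actual=0.80, target=0.95)
print(on_time_delivery.name, on_time_delivery.status())   # -> On-time delivery % red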

Displaying Versus Authoring Targets

Most dashboard vendors will say, “Sure, we can display performance targets” and they might be right. But the real question is, do they let business users define the target in the tool itself and then display it? Probably not. And if business users can’t define the targets easily, they won’t.

The workaround that vendors often propose is clumsy. They’ll say, “Get the IT department to revise the data warehouse data model to support performance targets, status, and trend indicators. Then have them create new extract routines that pull the targets from a planning system or spreadsheet into the data warehouse. Or failing that, have the business manager email the targets to an IT person who can manually add them to the database.” Yeah right. Who’s going to do that?

Ok, maybe in a true enterprise performance management environment, you want the targets to emanate from a corporate planning system and flow through the enterprise data warehouse to a local mart and into the dashboard. But honestly, few organizations are that sophisticated yet. So in the meantime, dashboard vendors have no excuse for not natively supporting the creation and display of performance targets.

Do Me a Favor. So the next time you talk to a dashboard vendor, do me a favor. When they demo their product, ask them where their targets and traffic lights are. Then, ask them to show you how a business person can attach a target to one of the charts or tables in the dashboard. If they start squirming, stammering, or sweating, then you know you are looking at a metrics portal not a real performance management system. And let me know how you make out!

Posted by Wayne Eckerson on May 11, 2010


Teaching the Business to Tango

Author’s Note: Given the positive reaction to my last blog, “Purple People: The Key to BI Success,” I thought I’d publish an excerpt from the second edition of my book, “Performance Dashboards: Measuring, Monitoring, and Managing Your Business,” due out this fall. This excerpt focuses on the responsibility of the business to close the business-IT gap, something that doesn’t get enough attention these days.

Aligning Business and IT

Tension Abounds. There has always been distrust between the business and the technical sides of an organization, but performance dashboard projects seem to heighten the tension to extreme levels. I have been in the technology industry for 20 years, and frankly, I’ve been shocked by the intensity of the distrust that I have witnessed between these two groups while researching this book.

Although there is much talk about the need to align business and information technology (IT) departments, little progress has been made. Part of the problem is systemic to IT departments and technical people, but another part involves the willingness of business executives and managers to engage with IT constructively on a long-term basis….

Counseling for Business

Although IT groups generally get the lion’s share of the blame for misalignment between business and IT, it takes two to tango, as they say. The business shares equal responsibility for the tension between the two groups—perhaps more so, since it does not always recognize how its actions and behavior contribute to the problem.

The business needs to understand that it often moves too fast for IT to keep up. It harbors a short-term bias toward action and rarely takes a long-term view toward building sustainable value. This is especially true in U.S. companies, whose Wild West heritage makes them notorious for acting first and asking questions later. The business needs to slow down sometimes and ask whether change is really needed or if they are reacting in knee-jerk fashion to the latest event or issue of the day. They need to prioritize their needs and evaluate what they need now and what can wait. By working incrementally, the business discovers that it gets what it needs faster than by trying to build a solution all at once.

Decentralized organizations magnify this behavior, parceling out authority to divisions and departments to make decisions faster and in the context of local markets. Although there are advantages to decentralization, there are considerable downsides that contribute to the perpetual misalignment of the business and IT. The scores of analytical and operational silos, including the hundreds and thousands of pernicious spreadmarts that hamstring corporate productivity, testify to the business’ fixation with speed and decentralized decision making.

Lack of Self Discipline. Finally, the business has the upper hand in its relationship with IT, and it often leverages its position in a high-handed and capricious manner. In many organizations, executives threaten to outsource or offshore IT when it does not deliver sufficient value, ignoring the possibility that their own actions and decisions may have crippled IT’s ability to function effectively. The business also lacks a reasonable degree of restraint and self-discipline when it comes to IT projects. One IT manager said his company’s annual technology planning process is a sham because the business cannot discipline itself to live within its limits.

“Prior to the beginning of every calendar year, the business prioritizes IT projects for the next 12 months. Out of 90 projects, they identify 60 of them as ‘high priority’ and we create a schedule to deliver them,” says the beleaguered IT manager. “But even before January 1st arrives, the business adds 20 more ‘high-priority’ projects to our list and adds another 20 projects before April. And then they tell us in March that we are already two months behind schedule!”

There are many ways to align business and IT and create a lasting sustainable partnership. In the world of BI, these range from gathering better business requirements and adopting new agile development methodologies to creating a BI governance program and creating a BI Center of Excellence that leverages business analysts and super users. My book and blogs go into detail on these and other techniques. Stay tuned!

 

Posted by Wayne Eckerson on May 6, 2010


"Purple People": The Key to BI Success

I went to a small, liberal arts college in western Massachusetts whose mascot is a “purple cow” -- presumably chosen because of the large number of cows that graze in the foothills of nearby mountains that glisten a faint purple as the afternoon sun fades into twilight.

Although our mascot didn’t strike fear in the hearts of our athletic opponents, that was fine by us. We were an academic institution first and foremost. But, it didn’t hurt that our sports teams tended to win more often than not.

We loved our “purple cow” mascot because this bit of serendipity made it hard for others to classify us: most of us so-called "purple people" weren’t just students or athletes or artists but a versatile blend of the three.

I’ve been a “purple person” for some time. And now I want to invite you to be one, too.

Of course, I am not asking you to enroll in my alma mater. I think most of us are too old to qualify at this point! What I am asking, however, is for you to exhibit the versatility of mind and experience that is required to deliver a successful business intelligence (BI) solution.

The Color Purple

The color purple is formed by mixing two primary colors: red and blue. These colors symbolize strong, distinct, and independent perspectives. In the world of BI, let’s say that “red” stands for the BI technologists and “blue” represents the business departments.

In most organizations, these two groups are at loggerheads. Neither side trusts or respects the other. This is largely because neither understands the pressures, deadlines, and challenges that the other faces. And, there is a yawning cultural gulf between the two groups that magnifies their mutual hostility: they speak a different language, report to different executives, travel in different social circles, and possess different career ambitions.

In contrast, purple people are neither red nor blue. They are neither pure technologists nor pure business people; they are a blend of both. They are “purple people”!

BI requires “purple people” to succeed. Business intelligence is not like most IT disciplines; it requires a thorough and ongoing understanding of business issues, processes, tactics, and strategy to succeed. BI is about delivering information that answers business questions. And since those questions change from day to day and week to week and are often shaped by the larger market landscape, BI solutions can’t succeed unless they continuously adapt.

The only way to create “adaptable systems”-- intrinsically a contradiction in terms--is to find people who are comfortable straddling the worlds of business and technology. These “purple people” can speak the language of business and translate that into terms that IT people can understand. Conversely, they can help business people understand how to exploit the organization’s information repository and analytical tools to solve pressing business problems.

Purple people are key intermediaries who can reconcile business and IT and forge a strong and lasting partnership that delivers real value to the organization.

Finding “Purple People”

So, where can you find “purple people”? Although I’m proud of my college, I wouldn’t suggest that you go there to find them, as smart and ambitious as they might be. In fact, I wouldn’t seek out any college graduates, or by extension, the junior staff of large systems integrators. These folks lack experience in both “red” and “blue” camps. At best, they serve as translators; but lacking real-world knowledge and experience, they always lose something in translation.

The best “purple people” are the proverbial switch hitters. They have strong credentials and a solid reputation in either a business or IT department, and then they switch sides. Their versatility creates an immediate impact.

For example, a financial analyst who joins the BI team brings with him or her knowledge of the finance department and its people, processes, and challenges. They can speak candidly and clearly with former colleagues when hashing out issues and clarifying requirements. They know the difference between “needs” and “wishes” and can create a realistic priority list that works for both sides.

BI directors can straddle both worlds by spending as much time talking with business counterparts as with the technologists on their team. In addition, they should act like business people and manage the BI department like a business: they should craft a strategy document that specifies the department’s mission, vision, values, and strategy and establish metrics for customer success and measure performance continuously.

And, most importantly, BI directors should recruit business people from every department to serve on their BI teams and act as liaisons to their former departments. These “purple people” form the glue that bonds the BI team and the business together.

Summary

By definition, business intelligence leverages information technology to drive business insight. As such, pure technologists or pure business people can’t harness BI successfully. BI needs “purple people” to forge tight partnerships between business people and technologists and harness information for business gain.

People often ask me about career opportunities in BI. It should be obvious by now, but my answer is: “Become a purple person. If you straddle business and technology, you are indispensable.”

Posted by Wayne Eckerson on April 29, 2010


12 Characteristics of Effective Metrics

Creating performance metrics is as much art as science. To guide you in your quest, here are 12 characteristics of effective performance metrics.

1. Strategic. To create effective performance metrics, you must start at the end point--with the goals, objectives, or outcomes you want to achieve--and then work backwards. A good performance metric embodies a strategic objective. It is designed to help the organization monitor whether it is on track to achieve its goals. The sum of all performance metrics in an organization (along with the objectives they support) tells the story of the organization’s strategy.

2. Simple. Performance metrics must be understandable. Employees must know what is being measured, how it is calculated, what the targets are, how incentives work, and, most importantly, what they can do to affect the outcome in a positive direction. Complex KPIs that consist of indexes, ratios, or multiple calculations are difficult to understand and, worse, not clearly actionable.

"We hold forums where we show field technicians how our repeat call metric works and how it might impact them. We then have the best technicians meet with others to discuss strategy and techniques that they use to positively influence the metric," says a director of customer management at an energy services provider.

3. Owned. Every performance metric needs an owner who is held accountable for its outcome. Some companies assign two or more owners to a metric to engender teamwork. Companies often embed these metrics into job descriptions and performance reviews. Without accountability, measures are meaningless.

4. Actionable. Metrics should be actionable. That is, if a metric trends downward, employees should know what corrective actions to take to improve performance. There is no purpose in measuring activity if users cannot change the outcome. Showing that sales are falling isn’t very actionable; showing that sales to a specific customer segment are falling compared to others is more actionable.

Actionable metrics require employees who are empowered to take action. Managers must delegate sufficient authority to subordinates so they can make decisions on their own about how to address situations as they arise. This seems obvious, but many organizations hamstring workers by circumscribing the actions they can take to meet goals. Companies with hierarchical cultures often have difficulty here, especially when dealing with front-line workers whose actions they have historically scripted. These companies need to replace scripts with guidelines that give users more leeway to solve problems in their own novel ways.

5. Timely. Actionable metrics require timely data. Performance metrics must be updated frequently enough so the accountable individual or team can intervene to improve performance before it is too late. Some people argue that executives do not need actionable or timely information because they primarily make strategic decisions for which monthly updates are good enough. However, the most powerful change agent in an organization is a top executive armed with an actionable KPI.

6. Referenceable. For users to trust a performance metric, they must understand its origins. This means every metric should give users the option to view its metadata, including the name of the owner, the time the metric was last updated, how it was calculated, systems of origin, and so on. Most BI professionals have learned the hard way that if users don’t trust the data, they won’t use it. The same is true for performance metrics.

7. Accurate. It is difficult to create performance metrics that accurately measure an activity. Part of this stems from the underlying data, which often needs to be scanned for defects, standardized, deduped, and integrated before displaying to users. Poor systems data creates lousy performance metrics that users won’t trust. Garbage in, garbage out. Companies should avoid creating metrics when the condition of source data is suspect.

Accuracy is also hard to achieve because of the way metrics are calculated. For example, a company may see a jump in worker productivity, but the increase is due more to an uptick in inflation than internal performance improvements. This is because the company calculates worker productivity by dividing revenues by the total number of workers. Thus, a rise in the inflation rate, which artificially boosts revenues—which is the numerator in the metric—increases worker productivity even though workers did not become more efficient.
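A toy calculation makes the distortion visible (the figures are invented): nominal revenue per worker rises with inflation even when real output per worker is flat, and deflating revenue removes the phantom gain.

revenue_y1, workers = 100_000_000, 1_000
inflation = 0.04                                        # 4% price inflation, no real growth
revenue_y2 = revenue_y1 * (1 + inflation)

nominal_y1 = revenue_y1 / workers                       # 100,000 per worker
nominal_y2 = revenue_y2 / workers                       # 104,000 per worker -- looks like a gain
real_y2    = (revenue_y2 / (1 + inflation)) / workers   # 100,000 -- the "gain" was inflation

print(nominal_y1, nominal_y2, real_y2)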

Also, it is easy to create metrics that do not accurately measure the intended objective. For example, many organizations struggle to find a metric to measure employee satisfaction or dissatisfaction. Some might ask users in surveys but it’s unclear whether employees will answer questions truthfully. Others might use the absenteeism rate but this might be skewed by employees who miss work to attend a funeral, care for sick family members, or stay home when daycare is unavailable.

8. Correlated. Performance metrics are designed to drive desired outcomes. Many organizations create performance metrics but never calculate the degree to which they influence the behaviors or outcomes they want. Companies must continually refresh performance metrics to ensure they drive the desired outcomes.
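One simple way to apply this test, sketched below with invented monthly figures, is to periodically measure how strongly a candidate metric actually moves with the business outcome it is supposed to drive:

from statistics import correlation   # Python 3.10+

repeat_calls = [0.22, 0.18, 0.25, 0.15, 0.12, 0.20]   # candidate driver metric, by month
churn_rate   = [0.09, 0.07, 0.11, 0.06, 0.05, 0.08]   # business outcome, by month

print(f"Pearson r = {correlation(repeat_calls, churn_rate):.2f}")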

9. Game-proof. Organizations need to test all performance metrics to ensure that workers can’t circumvent them out of laziness or greed or go through the motions to make a red light turn green without making substantive changes. “Users always look for loopholes in your metrics,” says one BI manager. To prevent users from “fudging” customer satisfaction numbers, one company hires a market research firm to audit customer surveys.

10. Aligned. It’s important that performance metrics are aligned with corporate objectives and don’t unintentionally undermine each other, a phenomenon called “sub-optimization.” To align metrics, you need to devise them together in the context of an entire ecosystem designed to drive certain behaviors and avoid others.

11. Standardized. A big challenge in creating performance metrics is getting people to agree on the definitions of terms, such as sales, profits, or customer, that underpin most metrics. Standardizing terms is critical if organizations are going to distribute performance dashboards to different groups at multiple levels of the organization and roll up the results. Without standards, the organization risks spinning off multiple, inconsistent performance dashboards whose information cannot be easily reconciled.

12. Relevant. A performance metric has a natural life cycle. When first introduced, the performance metric energizes the workforce and performance improves. Over time, the metric loses its impact and must be refreshed, revised, or discarded.

"We usually see a tremendous upswing in performance when we first implement a scorecard application,” says a program manager at a major high tech company. “But after a while, performance trails off. In the end you can’t control people, so you have to continually reeducate them about the importance of the processes that the metrics are measuring or you have to change the processes."

Posted by Wayne Eckerson on April 19, 2010