
TDWI Blog



C'mon Dashboard Vendors: Time to Step Up!

I’m perplexed why some BI vendors treat the metrics in their dashboards differently than the metrics in their scorecards. When you look at their scorecard products, all the metrics have targets associated with them and display color-coded traffic lights and trending symbols. But when you examine their dashboards, the metrics are merely charts and tables without performance context. These vendors act as if dashboards are simply a collection of charts and tables—a metrics portal, if you will—not a bona fide performance management system.

To me, if you insert a metric in a dashboard or a scorecard, then it’s worthy enough to be measured against a goal and displayed in all its color-coded glory. Otherwise, why are you measuring the activity at all? If no one cares whether the activity is performed well or not—or whether it’s aligned with strategic, tactical, or operational objectives—what’s the point? You are simply cluttering the dashboard with useless data.
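To make the idea concrete, here is a minimal sketch of the target-plus-traffic-light logic described above. The 10% warning band and the function name are illustrative assumptions, not any vendor's actual implementation:

```python
def traffic_light(actual, target, warn_band=0.10):
    """Map a metric's actual value against its target to a color-coded status.

    The 10% warning band is an illustrative assumption, not a standard.
    """
    ratio = actual / target
    if ratio >= 1.0:
        return "green"   # meeting or beating the goal
    if ratio >= 1.0 - warn_band:
        return "yellow"  # within the warning band below the target
    return "red"         # well below target

# e.g., revenue of 92 against a target of 100 falls in the warning band
print(traffic_light(92, 100))  # yellow
```

The point is how little logic is required: once a metric has a target, rendering its status is trivial, which is why a dashboard metric without one is just a chart.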

To test my conjecture, I asked the audience at the Palladium Performance Management Conference during a panel session today whether the metrics in their dashboards have goals associated with them (or should anyway). The resounding response was, “Yes!” It turns out that user organizations don’t make the same arbitrary distinctions between scorecards and dashboards that vendors do.

Of course, there was some debate among my fellow panelists about this issue. Sure, there are some metrics, like top ten lists, that are purely diagnostic yet important to include in a dashboard. And some operational users may internalize metric goals because they are so immersed in the processes they manage and the organization doesn’t feel the need to instrument those metrics. Yet, the fundamental principle holds true: every metric in a scorecard AND dashboard should have an associated target.

Displaying Versus Authoring Targets

Most dashboard vendors will say, “Sure, we can display performance targets” and they might be right. But the real question is, do they let business users define the target in the tool itself and then display it? Probably not. And if business users can’t define the targets easily, they won’t.

The workaround that vendors often propose is clumsy. They’ll say, “Get the IT department to revise the data warehouse data model to support performance targets, status, and trend indicators. Then have them create new extract routines that pull the targets from a planning system or spreadsheet into the data warehouse. Or failing that, have the business manager email the targets to an IT person who can manually add them to the database.” Yeah right. Who’s going to do that?

Ok, maybe in a true enterprise performance management environment, you want the targets to emanate from a corporate planning system and flow through the enterprise data warehouse to a local mart and into the dashboard. But honestly, few organizations are that sophisticated yet. So in the meantime, dashboard vendors have no excuse for not natively supporting the creation and display of performance targets.
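As a rough illustration of what "natively supporting" target authoring might mean, here is a hypothetical sketch of a dashboard-side target store. Nothing here reflects a real product; the class and method names are invented for the example:

```python
# Hypothetical sketch: a dashboard-side target store that lets a business
# user attach a target to a metric without an IT round-trip to the
# warehouse data model.
class TargetStore:
    def __init__(self):
        self._targets = {}  # metric name -> target value

    def set_target(self, metric, value):
        """Record a target authored by a business user in the tool itself."""
        self._targets[metric] = value

    def annotate(self, metrics):
        """Join warehouse metric readings with user-authored targets."""
        return {
            name: {"actual": actual, "target": self._targets.get(name)}
            for name, actual in metrics.items()
        }

store = TargetStore()
store.set_target("on_time_shipments", 0.95)
print(store.annotate({"on_time_shipments": 0.91, "returns": 120}))
```

Metrics without a target simply come back with `target` set to `None`, which is exactly the "chart without performance context" situation the dashboard should then flag.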

Do Me a Favor. So the next time you talk to a dashboard vendor, do me a favor. When they demo their product, ask them where their targets and traffic lights are. Then, ask them to show you how a business person can attach a target to one of the charts or tables in the dashboard. If they start squirming, stammering, or sweating, then you know you are looking at a metrics portal not a real performance management system. And let me know how you make out!

Posted by Wayne Eckerson on May 11, 2010


"Purple People": The Key to BI Success

I went to a small, liberal arts college in western Massachusetts whose mascot is a “purple cow” -- presumably chosen because of the large number of cows that graze in the foothills of nearby mountains that glisten a faint purple as the afternoon sun fades into twilight.

Although our mascot didn’t strike fear in the hearts of our athletic opponents, that was fine by us. We were an academic institution first and foremost. But, it didn’t hurt that our sports teams tended to win more often than not.

We loved our “purple cow” mascot because this bit of serendipity made it hard for others to classify us: most of us so-called "purple people" weren’t just students or athletes or artists but a versatile blend of the three.

I’ve been a “purple person” for some time. And now I want to invite you to be one, too.

Of course, I am not asking you to enroll in my alma mater. I think most of us are too old to qualify at this point! What I am asking, however, is for you to exhibit the versatility of mind and experience that is required to deliver a successful business intelligence (BI) solution.

The Color Purple

The color purple is formed by mixing two primary colors: red and blue. These colors symbolize strong, distinct, and independent perspectives. In the world of BI, let’s say that “red” stands for the BI technologists and “blue” represents the business departments.

In most organizations, these two groups are at loggerheads. Neither side trusts or respects the other. This is largely because neither understands the pressures, deadlines, and challenges that the other faces. And, there is a yawning cultural gulf between the two groups that magnifies their mutual hostility: they speak a different language, report to different executives, travel in different social circles, and possess different career ambitions.

In contrast, a purple person is neither red nor blue. They are neither pure technologist, nor pure business; they are a blend of both. They are “purple people”!

BI requires “purple people” to succeed. Business intelligence is not like most IT disciplines; it requires a thorough and ongoing understanding of business issues, processes, tactics, and strategy to succeed. BI is about delivering information that answers business questions. And since those questions change from day to day and week to week and are often shaped by the larger market landscape, BI solutions can’t succeed unless they continuously adapt.

The only way to create “adaptable systems”-- intrinsically a contradiction in terms--is to find people who are comfortable straddling the worlds of business and technology. These “purple people” can speak the language of business and translate that into terms that IT people can understand. Conversely, they can help business people understand how to exploit the organization’s information repository and analytical tools to solve pressing business problems.

Purple people are key intermediaries who can reconcile business and IT and forge a strong and lasting partnership that delivers real value to the organization.

Finding “Purple People”

So, where can you find “purple people”? Although I’m proud of my college, I wouldn’t suggest that you go there to find them, as smart and ambitious as they might be. In fact, I wouldn’t seek out any college graduates, or by extension, the junior staff of large systems integrators. These folks lack experience in both “red” and “blue” camps. At best, they serve as translators; but lacking real-world knowledge and experience, they always lose something in translation.

The best “purple people” are the proverbial switch hitters. They have strong credentials and a solid reputation in either a business or IT department, and then they switch sides. Their versatility creates an immediate impact.

For example, a financial analyst who joins the BI team brings with him or her knowledge of the finance department and its people, processes, and challenges. They can speak candidly and clearly with former colleagues when hashing out issues and clarifying requirements. They know the difference between “needs” and “wishes” and can create a realistic priority list that works for both sides.

BI directors can straddle both worlds by spending as much time talking with business counterparts as with the technologists on their team. In addition, they should act like business people and manage the BI department like a business: they should craft a strategy document that specifies the department’s mission, vision, values, and strategy and establish metrics for customer success and measure performance continuously.

And, most importantly, BI directors should recruit business people from every department to serve on their BI teams and serve as liaisons to their former departments. These “purple people” form the glue that bonds BI team and business together.

Summary

By definition, business intelligence leverages information technology to drive business insight. As such, pure technologists or pure business people can’t harness BI successfully. BI needs “purple people” to forge tight partnerships between business people and technologists and harness information for business gain.

People often ask me about career opportunities in BI. It should be obvious by now, but my answer is: “Become a purple person. If you straddle business and technology, you are indispensable.”

Posted by Wayne Eckerson on April 29, 2010


Strategies for Creating a High-Performance BI Team

A key element in the success of any business intelligence (BI) program lies in the team you create to deliver solutions and the clout it has within the organization to secure resources and get things done. Although there is no right or wrong way to organize a BI team, there are some key principles that are worth knowing. The following seven guidelines can help you create a high-performance BI team that delivers outstanding value to your organization.

1. Recruit the best people. Jim Collins in his best-selling book “Good to Great” says great companies first get the right people “on the bus” (and the wrong people off it) and then figure out where to drive it. Collins says that the right people will help an organization figure out and execute a strategy, which is a much more effective approach than recruiting people based on their skills and experience to support a strategy which might become obsolete in short order.

From a BI perspective, this means we shouldn’t hire people just because they know a specific tool or programming language or have previous experience managing a specific task, such as quality assurance. If you need specialists like that, it’s better to outsource such positions to a low-cost provider on a short-term contractual basis. What you really want are people who are ambitious, adaptable, and eager to learn new skills. Although you should demand a certain level of technical competency and know-how, you ultimately want people who fundamentally believe that BI can have a transformative effect on the business and possess the business acumen and technical capabilities to make that happen.

Collins adds that the right people “don’t need to be tightly managed or fired up; they will be self-motivated by the inner drive to produce the best results and be part of something great.” In a BI setting, these people won’t just do a job; they’ll figure out the issues and work proactively to get things done. Sure, you’ll have to pay them a lot and provide upward career mobility, but a handful of the “right people” will produce more than a dozen “so-so” people.

2. Create multi-disciplinary teams. In most early stage BI teams, a handful of highly motivated people play a myriad of roles: the BI project manager serves as the BI architect and requirements analyst; the BI report developer also builds ETL code and performs quality assurance checks; the technical writer provides training and support. With enough talent, resources, and chutzpah, these small, cross-disciplinary teams deliver fantastic results.

And success breeds bigger budgets, more staff, and greater specialization. In most cases, linear development by groups of specialists replaces small, multi-disciplinary teams. The specialist subgroups (e.g., requirements, modeling, ETL, data warehousing, report development, support) operate in isolation, throwing their output “over the wall” to the next group in the BI assembly line. This industrial era approach inevitably becomes mired in its own processes, and project backlogs grow bigger. The once nimble, multi-disciplinary BI team loses its penchant for speed and loses its luster in the eyes of the business. (See “Revolutionary BI: When Agile Isn’t Fast Enough.”)

The key to remaining nimble and agile as your BI team grows is to recreate the small, multi-disciplinary teams from your early stage BI initiative. Assign three to five people responsibility for delivering an entire BI solution from source to report. Train them in agile development techniques so they work iteratively with the business to deliver solutions quickly. With multiple, multi-disciplinary teams, you may need to reset the architecture once in a while to align what the teams are building on the ground, but this tradeoff is worth it.

Multi-disciplinary teams work collaboratively and quickly to find optimal solutions to critical problems, such as whether to code rules in a report, the data model, or the ETL layer. They also provide staff more leadership opportunities and avenues for learning new skills and development techniques. By providing more opportunities for learning and growth, you will increase your staff’s job satisfaction and loyalty. The most productive BI teams that I’ve seen have worked together for 10 to 15 years.

3. Establish BI Governance. Once a BI team has achieved some quick wins, it needs to recruit the business to run the BI program while it assumes a supportive role. The key indicator of the health of a BI program is the degree to which the business assumes responsibility for its long-term success. Such commitment is expressed in a formal BI governance program.

Most BI governance programs consist of two steering committees that meet regularly to manage the BI initiative. An executive steering committee comprised of BI sponsors from multiple departments meets quarterly to review the BI roadmap, prioritize projects, and secure funding. Second, a working committee comprised of business analysts (i.e., subject matter experts who are intensive consumers of data) meets weekly or monthly to define the BI roadmap, hash out DW definitions and subject areas, suggest enhancements, and select products.

The job of the BI team is to support the two BI governance committees in a reciprocal, trusting relationship. The BI team builds and maintains what the committees want to deploy, while educating the sponsors about the potential of BI to transform their processes and what solutions are feasible at what cost. The BI team also provides continuous education about emerging BI trends and technologies that have the potential to reinvent the business.

4. Find Purple People. The key to making BI governance programs work is recruiting people who can straddle the worlds of business and information technology (IT). These people are neither blue (i.e., business) nor red (i.e., IT) but a combination of both. These so-called purple people can speak both the language of business and data, making them perfect intermediaries between the two groups.

Astute BI directors are always looking for potential purple people to recruit to their teams. Purple people often hail from the business side where they’ve served as a business analyst or a lieutenant to a BI sponsor. Their personal ties to the people in a department, such as finance, coupled with deep knowledge of business processes and data gives them instant credibility with the business. They spend most of their time in the functional area, talking with business leaders and sitting on advisory boards where they share insights about how the business can best leverage BI to accomplish its goals.

5. Aspire to Become a Solutions Provider. The best BI teams aren’t content simply to provision data. BI directors know that if the business is to reap the full value of the BI resource, their teams have to get involved in delivering BI solutions. (See “Evolving Your Team from a Data Provider to a Solutions Provider.”) Although some departments may want to build their own reports and applications, few can sustain the expertise to fully exploit DW and BI capabilities and treat information as a critical resource.

High-performance BI teams work with each department to build a set of standard interactive reports or dashboards that meet 60% to 80% of the needs of casual users in the department. They then train and support each department’s “Super Users” -- tech-savvy business users or business analysts--to use self-service BI tools to create ad hoc reports on behalf of the casual users in the department, meeting the remaining 20% to 40% of their information requirements. The Super Users, in effect, become extensions of the BI team in each department and provide an additional set of “eyes and ears” to keep track of what’s going on from a BI perspective.

6. Give Your BI Team a Name. A name is a powerful thing that communicates meaning and influences perception. Most business people don’t know what business intelligence or analytics is (or may have faulty notions that don’t conform with the mission of your team). So spend time considering appropriate names that clearly communicate what your group does and why it’s important to the business.

For example, a group that created predictive models in a large financial services firm called itself the “Marketing Analytics and Business Insights” group. But it discovered that people didn’t know what the word “analytics” meant or how it differed from “business insights.” Some department heads were resentful of its million dollar budget, and the group felt it continually had to defend itself. So it came up with the tagline: “We analyze information to provide usable insights to the organization.” This branding significantly improved the perceived value of the group, and it became a “go to” source for information inside the company.

7. Position BI within an Information Management Department. Finally, the BI team should be organized within a larger information management (IM) department that is separate from IT and reports directly to the CIO or COO. The IM department is responsible for all information-driven applications that support the business. These may include: data warehousing, business intelligence, performance management, advanced analytics, spatial analytics, customer management, and master data management.

The IM department maintains a close alliance with the IT department, whose responsibilities include managing the core computing and networking infrastructure that the IM group uses. For instance, the IM group is responsible for designing and managing the data warehouse, while the IT group is responsible for tuning and operating the physical databases that run the data warehouse and the data center in which all data processing occurs.

Separating IM from IT provides a clear signal to the business that these are two separate domains that require different skill sets and career paths. People in the IM group are much more business- and information-driven than those in the IT department, who are more technology focused.

By following these seven steps, you can transform your BI team into a high-performance organization that earns the respect of the business and enjoys sustained success.

Posted by Wayne Eckerson on March 27, 2010


Three Tiers of Analytic Sandboxes: New Techniques to Empower Business Analysts

Analytic sandboxes are proving to be a key tactic in liberating business analysts to explore data while preventing the proliferation of spreadmarts and renegade data marts. Many BI teams already provide sandboxes of some sort, but few recognize that there are three tiers of sandboxes that can be deployed individually or in concert to meet the unique needs of every organization.

Analytic sandboxes adhere to the maxim, “If you can’t beat them, join them.” They provide a “safe haven” for business analysts to explore enterprise data, combine it with local and external data, and then massage and package the resulting data sets without jeopardizing an organization’s proverbial “single version of truth” or adversely affecting performance for general DW users.

By definition, analytic sandboxes are designed for exploratory analysis, not production reporting or generalized distribution. Ideally, sandboxes come with an expiration date (e.g. 90 days), reinforcing the notion that they are designed for ad hoc analyses, not application development. If analysts want to convert what they’ve created into a scheduled report or application, they need to turn it over to the BI team to “productionize” it.
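The expiration policy mentioned above is simple to enforce mechanically. Here is a minimal sketch, assuming a 90-day time-to-live tracked from the sandbox's creation date (the function name and policy constant are invented for illustration):

```python
from datetime import date, timedelta

# Illustrative sketch: enforcing a 90-day sandbox expiration policy.
SANDBOX_TTL = timedelta(days=90)

def is_expired(created, today=None):
    """Return True if a sandbox created on `created` has passed its TTL."""
    today = today or date.today()
    return today - created > SANDBOX_TTL

# A sandbox created 120 days ago is past its expiration date
print(is_expired(date.today() - timedelta(days=120)))  # True
```

In practice the check would run as a scheduled job that notifies the analyst before reclaiming the partition, reinforcing that the sandbox is for ad hoc analysis rather than application hosting.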

Unfortunately, analytic sandboxes can’t enforce information policies. Analysts can still export data sets to their desktop machines, email results to colleagues, and create unauthorized production applications. Ultimately, organizations that establish sandboxes must establish policies and procedures for managing information in a consistent manner and provide sufficient education about proper ways to produce and distribute information. Nonetheless, many BI teams are employing analytic sandboxes with reasonable success.

Tiers of Sandboxes

1. DW-Centric Sandboxes. The traditional analytic sandbox carves out a partition within the data warehouse database, upwards of 100GB in size, in which business analysts can create their own data sets by combining DW data with data they upload from their desktops or import from external sources. These DW-centric sandboxes preserve a single instance of enterprise data (i.e., they don’t replicate DW data), make it easier for database and DW administrators to observe what analysts are doing, and help analysts become more comfortable working in a corporate data environment. It’s also easier for the BI team to convert analyses into production applications since the analytic output is already housed in the DW.

However, a DW-centric sandbox can be difficult to manage from a systems perspective. Database administrators must create and maintain partitions and access rights and tune workload management utilities to ensure adequate performance for both general DW users and business analysts. An organization that has dozens or hundreds of analysts, each of whom wants to create large data sets and run complex queries, may bog down performance even with workload management rules in place. Inevitably, the BI team may need to upgrade the DW platform at considerable expense to support the additional workload.

2. Replicated Sandboxes. One way to avoid performance problems and systems management complexities is to replicate the DW to a separate platform designed exclusively for analysts. Many companies have begun to physically separate the production DW from ad hoc analytical activity by purchasing specialized DW appliances.

This approach offloads complex, ad hoc queries issued by a handful of people to a separate machine, leaving the production DW to support standardized report delivery, among other things. DW performance improves significantly without a costly upgrade, and analysts get free rein of a box designed exclusively for their use.

Of course, the downside to this is cost and duplication of data. Organizations must purchase, install, and maintain a separate database platform--which may or may not run the same database and server hardware as the DW. Executives may question why they need a separate machine to handle tasks they thought the DW was going to handle.

In addition, the BI team must establish and maintain a utility to replicate the data to the sandbox, which may take considerable expertise to create and maintain. The replication can be done at the source systems, the ETL layer, the DW layer (via mirrored backup), or the DW storage system. Also, with multiple copies of data, it’s easy for the two systems to get out of sync and for analysts to work with outdated information.

3. Managed Excel Sandboxes. The third tier of analytic sandbox runs on the desktop. New Excel-based analytical tools, such as Microsoft’s PowerPivot and Lyzasoft’s Lyza Workstation, contain in-memory columnar databases that run on desktop machines, giving analysts unprecedented power to access, massage, and analyze large volumes of data in a manner that conforms to the way they’ve traditionally done such work (i.e., using Excel rather than SQL).

Although these spreadsheets-on-steroids seem like a BI manager’s worst nightmare, there is a silver lining: analysts who want to share their results have to publish through a server managed by corporate IT. This is why I call this type of sandbox a “managed Excel” environment.

For example, with Microsoft PowerPivot, analysts publish their results to Microsoft SharePoint, which makes the results available to other users via Excel Services, which is a browser-based version of Excel. Excel Services prevents users from changing or downloading the report, preventing unauthorized distribution. In the same way, Lyzasoft lets analysts publish data to the Lyza Commons, where others can view and comment on the output via browser-based collaborative tools.

Of course, where there is a will there is a way and business analysts can and will find ways to circumvent the publishing and distribution features built into PowerPivot and Lyza workbooks and other managed Excel environments. But the collaborative features of their server-based environments are so powerful and compelling that I suspect most business analysts will take the path of least resistance and share information in this controlled manner.

Combining Sandboxes. A managed Excel sandbox might work well in conjunction with the other two sandboxes, especially if the corporate sandboxes have performance or size constraints. For example, analysts could download a subset of data from a centralized sandbox to their managed Excel application, combine it with local data on their desktops, and conduct their analyses using Excel. If they liked what they discovered, they could then run the analysis against the entire DW within the confines of a centralized sandbox.

Our industry is in the early stages of learning how to make most effective use of analytic sandboxes to liberate power users without undermining information consistency. With three (and perhaps more) types of analytic sandboxes, BI teams can tailor the sandbox experience to meet the unique needs of their organization.

Posted by Wayne Eckerson on March 22, 2010


Evolving Your BI Team from a Data Provider to a Solutions Provider

In her presentation on “BI Roadmaps” at TDWI’s BI Executive Summit last month, Jill Dyche explained that BI teams can either serve as “data providers” or “solutions providers.” Data providers focus on delivering data in the form of data warehouses, data marts, cubes, and semantic layers that can be used by BI developers in the business units to create reports and analytic applications. Solutions providers, on the other hand, go one step further, by working hand-in-hand with the divisions to develop BI solutions.

I firmly believe that BI teams must evolve into the role of solutions provider if they want to succeed long term. They must interface directly with the business, serving as a strategic partner that advises the business on how to leverage data and BI capabilities to solve business problems and capitalize on business opportunities. Otherwise, they will become isolated and viewed as an IT cost-center whose mission will always be questioned and whose budget will always be on the chopping block.

Data Provisioning by Default. Historically, many BI teams become data providers by default because business units already have reporting and analysis capabilities, which they’ve developed over the years in the absence of corporate support. These business units are loath to turn over responsibility for BI development to a nascent corporate BI group that doesn’t know their business and wants to impose corporate standards for architecture, semantics, and data processing. Given this environment, most corporate BI teams take what they can get and focus on data provisioning, leaving the business units to weave gold out of the data hay they deliver.

Mired Down by Specialization

However, over time, this separation of powers fails to deliver value. The business units lose skilled report developers, and they don’t follow systematic procedures for gathering requirements, managing projects, and developing software solutions. They end up deploying multiple tools, embedding logic into reports, and spawning multiple, inconsistent views of information. Most of all, they don’t recognize the data resources available to them, and they lack the knowledge and skills to translate data into robust solutions using new and emerging BI technologies and techniques, such as OLAP cubes, in-memory visualization, agile methods, dashboards, scorecards, and predictive analytics.

On the flip side, the corporate BI team gets mired down with a project backlog that it can’t seem to shake. Adopting an industrialized assembly line mindset, it hires specialists to handle every phase of the information factory to improve efficiency (e.g., requirements, ETL, cube building, semantic modeling), yet it can’t accelerate development easily. Its processes have become too rigid and sequential. When divisions get restless waiting for the BI team to deliver, CFOs and CIOs begin to question their investments and put the BI team’s budget on the chopping block.

Evolving into Solutions Providers

Rethink Everything. To overcome these obstacles, a corporate BI team needs to rethink its mission and the way it’s organized. It needs to actively engage with the business and take some direct responsibility for delivering business solutions. In some cases, it may serve as an advisor to a business unit which has some BI expertise while in others it may build the entire solution from scratch where no BI expertise exists. By transforming itself from a back-office data provider to a front-office solutions developer, a corporate BI team will add value to the organization and have more fun in the process.

It will also figure out new ways to organize itself to serve the business efficiently. To provide solutions assistance without adding budget, it will break down intra-organizational walls and cross-train specialists to serve on cross-functional project teams that deliver an entire solution from A to Z. Such cross-fertilization will invigorate many developers who will seize the chance to expand their skill sets (although some will quit when forced out of their comfort zones). Most importantly, they will become more productive and before long eliminate the project backlog.

A High Performance BI Team

For example, Blue Cross/Blue Shield of Tennessee has evolved into a BI solutions provider over the course of many years. BI is now housed in an Information Management (IM) organization that reports to the CIO and is separate from the IT organization. The IM group consists of three subgroups: 1) the Data Management group, 2) the Information Delivery group, and 3) the IM Architecture group.

  • The Data Management group is comprised of 1) a data integration team that handles ETL work and data warehouse administration and 2) a database administration team that designs, tunes, and manages IM databases.
  • The Information Delivery group consists of 1) a BI and Performance Management team, which purchases, installs, and manages BI and PM tools and solutions and provides training, and 2) two customer-facing solutions delivery teams that work with business units to build applications. The first is the IM Health Informatics team, which builds clinical analytic applications using reporting, OLAP, and predictive analytics capabilities; the second is the IM Business Informatics team, which builds analytic applications for other internal departments (i.e., finance, sales, marketing).
  • The IM Architecture group builds and maintains the IM architecture, which consists of the enterprise data warehouse, data marts, and data governance programs, as well as closed loop processing and the integration of structured and unstructured data.

Collaborative Project Teams. Frank Brooks, director of data management and information delivery at BCBS of Tennessee, says that the IM group dynamically allocates resources from each IM team to support business-driven projects. Individuals from the Informatics teams serve as project managers, interfacing directly with the customers. (While Informatics members report to the IM group, many spend most of their time in the departments they serve.) One or more members from each of the other IM teams (data integration, database administration, and BI/PM) are assigned to the project team, and they collaboratively work to build a comprehensive solution for the customer.

In short, the BI team of BCBS of Tennessee has organized itself as a BI solutions provider, consolidating all the functions needed to deliver comprehensive solutions in one group, reporting to one individual who can ensure the various teams collaborate efficiently and effectively to meet and exceed customer requirements. BCBS of Tennessee has won many awards for its BI solutions and will be speaking at this summer’s TDWI BI Executive Summit in San Diego (August 16-18).

The message is clear: if you want to deliver value to your organization and assure yourself a long-term, fulfilling career at your company, then don’t be satisfied with being just a data provider. Make sure you evolve into a solutions provider that is viewed as a strategic partner to the business.

Posted by Wayne Eckerson on March 16, 2010


Zen BI: The Wisdom of Letting Go

One of my takeaways from last week’s BI Executive Summit in Las Vegas is that veteran BI directors are worried about the pace of change at the departmental level. More specifically, they are worried about how to support the business’ desire for new tools and data stores without undermining the data warehousing architecture and single version of truth they have worked so hard to deliver.

At the same time, many have recognized that the corporate BI team they manage has become a bottleneck. They know that if they don’t deliver solutions faster and reduce their project backlog, departments will circumvent them and develop renegade BI solutions that undermine the architectural integrity of the data warehousing environment.

The Wisdom of Letting Go

In terms of TDWI’s BI Maturity Model, these DW veterans have achieved adulthood (i.e. centralized development and EDW) and are on the cusp of landing in the Sage stage. However, to achieve true BI wisdom (i.e. Sage stage), they must do something that is both counterintuitive and terrifying: they must let go. They must empower departments and business units to build their own DW and BI solutions.

Entrusting departments to do the right thing is a terrifying prospect for most BI veterans. They fear that the departments will create islands of analytical information and undermine the data consistency they have worked so hard to achieve. The thought of empowering departments makes them grip the proverbial BI steering wheel tighter. But asserting control at this stage usually backfires. The only option is to adopt a Zen-like attitude and let go.

Trust in Standards

I'm reminded of the advice Yoda gives his Jedi warriors-in-training in the movie “Star Wars”: “Let go and trust the force.” But, in this case, DW veterans need to trust their standards. That is, the BI standards that they’ve developed in the BI Competency Center, including definitions for business objects (i.e., business entities and metrics), processes for managing BI projects, techniques for developing BI software, and processes and procedures for managing ETL jobs and handling errors, among other things.

Some DW veterans who have gone down this path add the caveat: “trust but verify.” Although educating and training departmental IT personnel about proper BI development is critical, it’s also important to create validation routines where possible to ensure business units conform to standards.

Engage Departmental Analysts

The cagiest veterans also recognize that the key to making distributed BI development work is to recruit key analysts in each department to serve on a BI Working Committee. The Working Committee defines DW and BI standards, technologies, and architectures and essentially drives the BI effort, reporting their recommendations to the BI Steering Committee comprised of business sponsors for approval. Engaging analysts who are most apt to create renegade BI systems ensures the DW serves their needs and helps ensure buy-in and support.

By adopting a Zen-like approach to BI, veteran DW managers can eliminate project backlogs, ensure a high level of customer satisfaction, and achieve BI nirvana.

Posted by Wayne Eckerson on February 28, 2010


Reflections from TDWI's BI Executive Summit

More than 150 BI Directors and BI Sponsors from small, medium, and large companies plus a dozen or so sponsors attended TDWI’s BI Executive Summit last week, a record turnout.

Here are a few of the things I learned:

- Veteran BI directors are worried about the pace of change at the departmental level. More specifically, they are worried about how to support the business’ desire for new tools and data stores without undermining the data warehousing architecture and single version of truth they have worked so hard to deliver. (See "Zen BI: The Wisdom of Letting Go.")

- Jill Dyche explained that corporate BI teams can either be Data Providers or Solutions Providers. That this is an option is a new concept for me. However, after some thought, I believe that unless BI teams help deliver solutions, the data they provision will be underutilized. Unless the BI team helps solve business problems by delivering business solutions, it can never be viewed as a strategic partner.

- Most BI teams have project backlogs and don’t have a great way to get in front of them. Self-service BI can help eliminate a lot of the onesy and twosey requests for custom reports. BI Portfolios and roadmaps can help prioritize deliverables but executives always override their own priorities. Many veteran BI managers are looking to push more development back into the departments as a way to accelerate projects.

- There is a lot of interest in predictive analytics, dashboards, and the cloud. Those were the top three vote-getters in response to the question, “Which technologies will have the most impact on your BI program in three years?”

- Most of the case studies at the Summit described real-time data delivery environments, often coupled with analytics. GE Rails applied statistical models to real-time data to help customer service agents identify the optimal facility to send railroad cars for repair; Linkshare captures and displays Web activity and commissions for external customers (publishers and advertisers); and Seattle Teachers’ Credit Union delivers real-time recommendations to customer service agents.

- There was a lot of interest in how to launch an analytics practice and Aldo Mancini provided some great tips from his experiences at Discover Financial Services. To get SAS analysts to start using the data warehouse as a way to accelerate model development, he had them help design the subject areas and variables that should go into it. Then, he taught them how to use SQL so they could transform data into their desired format.

We’re already gearing up for our next Summit which will be held August 16-18 in San Diego. Hope to see you there!

Posted by Wayne Eckerson on February 28, 2010


Launching an Analytics Practice: Ten Steps to Success

Everyone wants to move beyond reporting to deliver value-added insights through analytics. The problem is that few organizations know where to begin. Here is a ten-step guide for launching a vibrant analytics practice.

Launching the Practice

Step 1: Find an Analyst. You can’t do analytics without an analyst! Most companies have one or more analysts burrowed inside a department. Look for someone who is bright, curious, and understands key business processes inside and out. The analyst should like to work with numbers, have strong Excel, SQL, OLAP, and database skills, and ideally understand some statistics and data mining tools.

Step 2: Find a Business Person. The quickest way to kill an analytics practice is to talk about predictive models, optimization, or statistics with a business person. Instead, find one or more executives who are receptive to testing key assumptions about how the business works. For example, a retail executive might want to know, “Why do some customers stop buying our product?” A social service agency might want to know, “Which spouses are most likely not to pay alimony?” Ask them to dream up as many hypotheses to their questions as possible and then use those as inputs for your analysis.

Step 3: Gain Sponsorship. If step two piqued an executive’s interest, then you have a sponsor. Tell the sponsor what resources you need, if any, to conduct the test. Perhaps you need permission to free up an analyst for a week or two or hire a consultant to conduct the analysis. Ideally, you should be able to make do with people and tools you have in-house. A good analyst can work miracles with Excel and SQL, and there are many open source data mining packages on the market today as well as low-cost statistical add-ins for Excel and BI tools.

Step 4: Don’t Get Uppity. “You never want to come across smarter than the executive you are supporting,” says Matthew Schwartz, a former director of business analytics at Corporate Express. Don’t ever portray the model results as “the truth”; executives don’t trust models unless they make intuitive sense or prove their value in dollars and cents. For example, Schwartz was able to get his director of marketing to buy in to the results of a market basket analysis for Web site recommendations because the director recognized the model’s cross-selling logic: “Ah! It knows that people are buying office kits for new employees.”

Step 5: Make It Actionable. A model is worthless if people can’t act on it. This often means embedding the model in an operational application, such as a Web site or customer-facing application, or distributing the results in reports to salespeople or customer service representatives. In either case, you need to strip out the mathematics and decompose the model so it’s understandable and usable by people in the field. For example, a sales report might say, “These five customers are likely to stop purchasing office products from us because they haven’t bought toner in four weeks.”

Step 6: Make It Proactive. The kiss of death for an analytical model is to tell people something they already know. Rather than tell salespeople that customers who are purchasing fewer products than in the prior period are likely to churn (like the example in step five above), tell them about customers who will buy fewer products in the future because they have fallen below a critical statistical threshold and are vulnerable to competitive offers. Or, rather than forecast the number of loans that will go into default, identify the characteristics of good loans and bake those criteria into the loan origination process. If you deliver results that enable people to work proactively, you’ll become an overnight hero.
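One simple way to operationalize a “critical statistical threshold” like the one above is a z-score cutoff on recent purchase volume. The sketch below (customer names, figures, and the cutoff value are all invented for illustration) flags customers whose recent volume has fallen far enough below the group norm to warrant a proactive call, before the revenue actually disappears:

```python
import statistics

def flag_at_risk(purchases_by_customer, z_cutoff=-1.5):
    """Flag customers whose recent purchase volume sits far below the
    group norm -- a crude statistical threshold for churn risk."""
    volumes = list(purchases_by_customer.values())
    mean = statistics.mean(volumes)
    stdev = statistics.stdev(volumes)
    return sorted(
        customer
        for customer, volume in purchases_by_customer.items()
        if (volume - mean) / stdev < z_cutoff
    )

# Hypothetical recent order counts per customer
recent = {"Acme": 42, "Bluth Co": 40, "Initech": 38, "Hooli": 41, "Vandelay": 5}
print(flag_at_risk(recent))  # ['Vandelay']
```

A real model would use richer features and a proper training set, but even a cutoff this simple turns a backward-looking report into a forward-looking call list.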

Sustaining the Analytics Practice

Let’s assume your initial modeling efforts worked their magic and garnered you strong executive sponsorship. How do you build and sustain an analytics practice? What organizational and technical strategies do you employ to ensure that your analysts are as productive as possible? The following four steps will solidify your analytics practice.

Step 7: Centralize and Standardize the Data. The thing that slows down analysts the most is having to collect data spread across multiple systems and then clean, harmonize, and integrate it. Only then can they start to analyze the data. Obviously, this is what a data warehouse is designed to do, not an analyst. But a data warehouse only helps if it contains all or most of the data analysts need in a format they can readily use so they don’t have to hunt and reconcile data on their own. Typically, analytical modelers need wide, flat tables with hundreds of attributes to create models.
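The point about wide, flat tables is worth making concrete: the warehouse team can materialize one analytic base table so every analyst isn’t re-joining normalized tables before each model run. This sketch uses Python’s built-in sqlite3 as a stand-in for the warehouse; the table and column names are made up:

```python
import sqlite3

# Build one wide, flat table (one row per customer, many attribute
# columns) from hypothetical normalized warehouse tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT, tenure_months INTEGER);
    CREATE TABLE orders (customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'East', 24), (2, 'West', 6);
    INSERT INTO orders VALUES (1, 100.0), (1, 250.0), (2, 75.0);

    -- The analytic base table analysts actually model against
    CREATE TABLE analytic_base AS
    SELECT c.id, c.region, c.tenure_months,
           COUNT(o.customer_id) AS order_count,
           COALESCE(SUM(o.amount), 0) AS total_spend
    FROM customers c LEFT JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id, c.region, c.tenure_months;
""")
for row in con.execute("SELECT * FROM analytic_base ORDER BY id"):
    print(row)
```

Doing this once, centrally, also means every analyst starts from the same cleaned and harmonized numbers.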

Step 8: Provide Open Access to Data. Data warehouse administrators need to give analysts access to the data warehouse without having to file a request and wait weeks for an answer. Rather than broker access to the data warehouse, administrators should create analytical sandboxes using partitions and workload management that let analysts upload their own data and comingle it with data in the warehouse. This creates an analytical playground for analysts and keeps them from creating renegade data marts under their desks.
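As a minimal sketch of the sandbox idea, SQLite’s ATTACH can stand in for the partitioned sandbox a real warehouse DBMS would provide through schemas and workload management; the table names and data here are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")          # stands in for the warehouse
con.execute("CREATE TABLE sales (customer_id INTEGER, revenue REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 500.0), (2, 120.0)])

# The analyst's private sandbox: they upload their own data here,
# with no write access to warehouse tables.
con.execute("ATTACH DATABASE ':memory:' AS sandbox")
con.execute("CREATE TABLE sandbox.survey (customer_id INTEGER, satisfaction INTEGER)")
con.executemany("INSERT INTO sandbox.survey VALUES (?, ?)", [(1, 9), (2, 3)])

# Comingle uploaded data with warehouse data in a single query
rows = con.execute("""
    SELECT s.customer_id, s.revenue, v.satisfaction
    FROM sales s JOIN sandbox.survey v USING (customer_id)
    ORDER BY s.customer_id
""").fetchall()
print(rows)  # [(1, 500.0, 9), (2, 120.0, 3)]
```

The design choice that matters is the boundary: the analyst’s data lives in its own area, queryable alongside the warehouse but never polluting it, which removes the incentive to build a renegade data mart under the desk.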

Step 9: Centralize Analysts. Contrary to current practice, it’s best to centralize analysts in an Analytical Center of Excellence under the supervision of a Director of Analytics. This creates a greater sense of community and camaraderie among analysts and gives them more opportunities for advancement within the organization. It also minimizes the chance that they’ll be lured away by recruiters. Although they may be part of a shared services group, analysts should be physically embedded within the departments they support and have dotted-line responsibility to those department heads.

Step 10: Offload Reporting. The quickest way to undermine the productivity of your top analysts is to force them to field requests for ad hoc reports from business users. To eliminate the reporting backlog, the BI team and analysts need to work together to create a self-service BI architecture that empowers business users to generate their own reports and views. When designed properly, these interactive reports and dashboards will meet 60% to 80% of users’ information needs, freeing up business analysts and BI report developers to focus on more value-added activities.

So there you have it, ten steps to analytical nirvana. Easy to write, hard to do! Keep me informed about your analytics journey and the lessons you learn along the way! I’d love to hear your stories. You can reach me at [email protected].

Posted by Wayne Eckerson on February 26, 2010


Mashboards: New Tools for Self-Service BI

There is an emerging type of dashboard product that enables power users to craft ad hoc dashboards for themselves and peers by piecing together elements from existing reports and external Web pages. I’m calling these “Mashboards” because they “mash” together existing charts and tables within a dashboard framework. Other potential terms are “Report Portal,” “Metrics Portal,” and “Dashmart.”

I see Mashboards as the dashboard equivalent of the ad hoc report, which has spearheaded the self-service BI movement in recent years. Vendors began delivering ad hoc reporting tools to ease the report backlog that afflicts most BI deployments and dampens sales of BI tools. Ad hoc reports rely on a semantic layer that enables power users to drag and drop predefined business objects onto a WYSIWYG reporting canvas to create a simple report.

Likewise, Mashboards enable power users to select from predefined report “parts” (e.g., charts, tables, selectors) and drag and drop them onto a WYSIWYG dashboard canvas. Before you can create a Mashboard, IT developers need to create reports using the vendor’s standard report authoring environment. The “report parts” are often self-contained pieces of XML code--or gadgets--that are wired to display predefined sets of data or can be easily associated with data from a semantic layer. Power users can apply filters and selectors to the gadgets without coding.

Mashboards are a great way for organizations to augment enterprise or executive dashboards that are designed to deliver 60% to 80% of casual users’ information needs. Mashboards can address the remaining 20% to 40% of those needs on an ad hoc basis or deliver a highly personalized dashboard for an executive or manager. (I should note that enterprise dashboards should be personalizable as well.)

Dashmarts? However, there is a danger that Mashboards will end up becoming just another analytical silo. Their flexibility lends itself to creating visual spreadmarts, which is why I’m tempted to call them Dashmarts. That said, Mashboards that require power users to source all data elements from existing reports and parts should minimize this risk to some degree.

All in all, Mashboards are a great addition to a BI portfolio. They provide a new type of ad hoc report that is more visual and easily consumed by casual users. And they are a clever way for vendors to extend the value of their existing reporting and analysis tools.

Posted by Wayne Eckerson on February 16, 2010


Sleep Well at Night: Abstract Your Source Systems

It’s odd that our industry has established a best practice for creating a layer of abstraction between business users and the data warehouse (i.e., a semantic layer or business objects), but we have not done the same thing on the back end.

Today, when a database administrator adds, changes, or deletes fields in a source system, it breaks the feeds to the data warehouse. Usually, source system owners don’t notify the data warehousing team of the changes, forcing us to scramble to track down the source of the errors, rerun ETL routines, and patch any residual problems before business users awake in the morning and demand to see their error-free reports.

It’s time we get some sleep at night and create a layer of abstraction that insulates our ETL routines from the vicissitudes of source systems changes. This sounds great, but how?

Insulating Netflix

Eric Colson, whose novel approaches to BI appeared two weeks ago in my blog “Revolutionary BI: When Agile is Not Fast Enough,” has found a simple way to abstract source systems at Netflix. Rather than pulling data directly from source systems, Colson’s BI team pulls data from a file that source systems teams publish to. It’s kind of an old-fashioned publish-and-subscribe messaging system that insulates both sides from changes in the other.

“This has worked wonderfully with the [source systems] teams that are using it so far,” says Colson, who believes this layer of abstraction is critical when source systems change at breakneck speed, like they do at Netflix. “The benefit for the source systems team is that they get to go as fast as they want and don’t have to communicate changes to us. One team migrated a system to the cloud and never even told us! The move was totally transparent.”

On the flip side, the publish-and-subscribe system relieves Colson’s team of having to 1) access source systems, 2) run queries on those systems, 3) know the names and logic governing tables and columns in those systems, and 4) keep up with changes in those systems. They also get much better quality data from source systems this way.

Changing Mindsets

However, Colson admits that he might get push back from some source systems teams. “We are asking them to do more work and take responsibility for the quality of data they publish into the file,” says Colson. “But this gives them a lot more flexibility to make changes without having to coordinate with us.” If the source team wants to add a column, it simply appends it to the end of the file. 

This approach is a big mindset change from the way most data warehousing teams interface with source systems teams. The mentality is: “We will fix whatever you give us.” Colson’s technique, on the other hand, forces the source systems teams to design their databases and implement changes with downstream analysis in mind. For example, says Colson, “they will inevitably avoid adding proprietary logic and other weird stuff that would be hard to encapsulate in the file.”
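The append-only column rule is what makes the file interface stable. A sketch of the subscribe side, assuming a CSV-style published extract (the file layout and column names are illustrative, not Netflix’s actual format): the ETL job reads columns by name, so a column appended to the end of the file is simply invisible to existing loads.

```python
import csv
import io

def load_feed(fileobj, wanted=("customer_id", "plan", "signup_date")):
    """Read only the columns this load cares about, by name, so
    columns appended later by the source team are ignored."""
    reader = csv.DictReader(fileobj)
    return [{col: row[col] for col in wanted} for row in reader]

# Version 1 of the published file
v1 = io.StringIO("customer_id,plan,signup_date\n42,standard,2010-01-15\n")
# Version 2: the source team appended a column without telling anyone
v2 = io.StringIO("customer_id,plan,signup_date,device_type\n42,standard,2010-01-15,roku\n")

assert load_feed(v1) == load_feed(v2)  # the new column never breaks the feed
```

The same decoupling is why the source team can migrate a system to the cloud without notice: as long as the published file keeps its contract, the subscriber never knows.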

Time to Deploy

Call me a BI rube, but I’ve always assumed that BI teams by default create such an insulating layer between their ETL tools and source systems. Perhaps for companies that don’t operate at the speed of Netflix, ETL tools offer enough abstraction. But, it seems to me that Colson’s solution is a simple, low-cost way to improve the adaptability and quality of data warehousing environments that everyone can and should implement.

Let me know what you think!

Posted by Wayne Eckerson on February 8, 2010