
TDWI Blog

Wayne Eckerson

Reflections on the practice of business intelligence.

Wayne Eckerson is the author of many in-depth reports, a columnist for several business and technology magazines, a noted speaker and blogger, and the author of the best-selling book Performance Dashboards: Measuring, Monitoring, and Managing Your Business (John Wiley & Sons, 2005) and TDWI’s BI Maturity Model.


Proactive Analytics That Drive the Business

“I love the chart, but what am I supposed to do about it?” With that simple question, Ken Rudin is schooling analysts at Zynga in how to deliver information that makes a difference in the way the wildly successful gaming company creates and enhances games for customers.

“My mantra these days is ‘It’s gotta be actionable,’” says Rudin, former CEO of the early BI SaaS vendor LucidEra, who now runs analytics at Zynga, creators of Farmville, Mafia Wars, and other popular applications for Facebook, iPhone, and other networks. “Just showing that revenue is down doesn’t help our product managers improve the games. But if we can show the lifecycle with which a subgroup uses the game, we can open their eyes to things they never realized before.”

It’s surprising that Rudin has to do any analytics tutoring at Zynga. Its data warehouse is a critical piece of its gaming infrastructure, providing recommendations to players based on profiles compiled daily in the data warehouse and cached to memory. With over 40 million players and 3TB of new data a day, Zynga’s 200-node, columnar data warehouse from Vertica is no analytical windup toy. If it goes down for a minute, all hell breaks loose because product managers lose visibility into game traffic and trends.

Moreover, the company applies A/B testing to every new feature before deploying it and has a bevy of statisticians who continually dream up ways that product managers can enhance games to improve retention and collaboration among gaming users. “I’ve never seen a company that is so analytically driven. Sometimes I think we are an analytics company masquerading as a gaming company. Everything is run by the numbers,” says Rudin.
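For readers who want the mechanics, here is a minimal sketch of that kind of A/B retention comparison in SQL. The tables and columns (experiment_assignments, game_events) are hypothetical illustrations, not Zynga’s actual schema:

    -- Hypothetical A/B comparison: 7-day retention by test variant.
    -- Assumes each player is assigned to a variant on assigned_date and
    -- game_events logs one row per player per day of play.
    SELECT a.variant,
           COUNT(DISTINCT a.player_id) AS players_assigned,
           COUNT(DISTINCT e.player_id) AS players_retained,
           COUNT(DISTINCT e.player_id) * 100.0
             / COUNT(DISTINCT a.player_id) AS retention_pct
    FROM experiment_assignments a
    LEFT JOIN game_events e
      ON e.player_id = a.player_id
     AND e.event_date BETWEEN a.assigned_date + 1 AND a.assigned_date + 7
    GROUP BY a.variant;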

Anticipating Questions

Yet, when Rudin came to Zynga in early 2009, he discovered the analytics team was mostly in reaction mode, taking orders from product managers for custom reports. So, he split the team into two groups: 1) a reporting team that creates reports for product managers and 2) an analytics team that tests hypotheses and creates models using statistical and analytical methods. A third part of his team runs the real-time, streaming data warehousing environment.

The reporting team currently uses a homegrown SQL-based tool for creating parameterized reports. Rudin hopes to migrate them to a richer, self-service dashboard environment that delivers most of the routine information product managers need and lets them generate ad hoc views without the help of a SQL professional.
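To make the distinction concrete, a parameterized report is essentially a canned query with runtime placeholders. A minimal sketch, assuming a hypothetical game_events table; the :game_id and date parameters are bind variables a product manager would fill in:

    -- Hypothetical parameterized report: daily active users for one game.
    SELECT event_date,
           COUNT(DISTINCT player_id) AS daily_active_users
    FROM game_events
    WHERE game_id = :game_id
      AND event_date BETWEEN :start_date AND :end_date
    GROUP BY event_date
    ORDER BY event_date;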

Rudin is encouraging the analytics team to be more proactive. Instead of waiting for product managers to submit requests for hypotheses to test, analysts should suggest gaming enhancements that increase a game's "stickiness" and customer satisfaction. “It’s one thing to get answers to questions and it’s another to know what questions to ask in the first place. We need to show them novel ways that they can enhance the games to increase customer retention.”

Zynga is already an analytics powerhouse, but it sees an infinite opportunity to leverage the terabytes of data it collects daily to enhance the gaming experience of its customers. “My goal for the year is to use analytics to come up with new product innovations,” says Rudin. By proactively working with the business to improve core products, the analytics team is fast becoming an ideas factory to improve Zynga’s profitability.

Editor's Note: By the way, the Zynga Analytics team is growing as fast as the company, so if you’re interested in talking to them, please contact Ken at [email protected].

Posted by Wayne Eckerson on February 2, 2010


Revolutionary BI: When Agile is Not Fast Enough

Developers of BI unite! It is time that we liberate the means of BI production from our industrial past.

Too many BI teams are shackled by outdated modes of industrial organization. In our quest for efficiency, we’ve created rigid fiefdoms of specialization that have hijacked the development process (and, frankly, sucked all the enjoyment out of it as well).

We’ve created an insidious assembly line in which business specialists document user requirements that they throw over the wall to data management specialists who create data models that they throw over the wall to data acquisition specialists who capture and transform data that they throw over the wall to reporting specialists who create reports for end users that they throw over the wall to a support team who helps users understand and troubleshoot reports. The distance from user need to fulfillment is longer than Odysseus' journey home from Troy and just as fraught with peril.


Flattened BI Teams

Contrary to conventional wisdom, linear development based on specialization is highly inefficient. “Coordination [between BI groups] was killing us,” says Eric Colson, director of BI at Netflix. Colson inherited an industrialized BI team set up by managers who came from a banking environment. The first thing he did when he took the job was tear down the walls and cross-train everyone on the BI staff. “Everyone now can handle the entire stack--from requirements to database to ETL to BI tools.”

Likewise, the data warehousing team at the University of Illinois found its project backlog growing bigger each year until it reorganized itself into nine small, self-governing interdisciplinary groups. By cross-training its staff and giving members the ability to switch groups every year, the data warehousing team doubled the number of projects it handles with the same staff.

The Power of One

Going one step further, Colson believes that even small teams are too slow. “What some people call agile is actually quite slow.” He argues that one developer trained in all facets of the BI stack can work faster and more effectively than a team. For example, it’s easier and quicker for one person to decide whether to apply a calculation in the ETL or BI layer than a small team, he says.

Furthermore, Colson doesn’t believe in requirements documents or quality assurance (QA) testing. He disbanded those groups when he took charge. He believes developers should work directly with users, which is something I posited in a recent blog post titled "Principle of Proximity." And he thinks QA testing actually lowers quality because it relieves developers from having to understand the context of the data with which they are working.

It’s safe to say that Colson is not afraid to shake up the establishment. He admits, however, that his approach may not work everywhere: Netflix is a dynamic environment where source systems change daily so flexibility and fluidity are keys to BI success. He also reports directly to the CEO and has strong support as long as he delivers results.

Both the University of Illinois and Netflix have discovered that agility comes from a flexible organizational model and versatile individuals who have the skills and inclination to deliver complete solutions. They are BI revolutionaries who have successfully unshackled their BI organizations from the bondage of industrial era organizational models and assembly line development processes.

Posted by Wayne Eckerson on January 27, 2010


The Next Wave in BI: Decision Analysis

We’ve come a long way in business intelligence, but there are still plenty of miles to travel. We’ve gone through three distinct eras: Data Warehousing, Business Intelligence, and Performance Management. I think the next era is Decision Analysis.

In the 1990s, we focused on building repositories of integrated, historical data (i.e., the era of Data Warehousing); in the late 1990s and early 2000s, we focused on tools for reporting and analyzing information in our data warehouses (i.e., the era of Business Intelligence); in the late 2000s, we focused on using information to improve performance by monitoring key performance indicators (i.e., the era of Performance Management).

The next decade will focus on improving the way we make decisions. There is a lot to say here, and I haven’t completely formulated all my thoughts, but this era will take a long time to bear fruit because it involves understanding how the human mind processes information and how people interact in social groups to make decisions. To take BI to the next level, we need better insights into human behavior and perception. In other words, it’s time to recruit psychologists onto our BI teams.

In 2010, you will see the first fruits of the era of Decision Analysis. Specifically, you’ll see more robust collaborative capabilities embedded within BI tools and the first attempts to deliver formalized methods for evaluating the effectiveness of decisions made with those tools.

Collaboration

Most leading BI vendors are applying social media conventions to their toolsets to improve collaboration and decision making. For example, the online reporting service Swivel lets users rate and comment on charts published online by themselves or others. Following the lead of Facebook, LinkedIn, and other social media sites, some BI vendors will let you “follow” people whose analytical skills you admire and be alerted when they publish a new report. BI vendors will also beef up their guided analytics capabilities, enabling users to review the steps that a trusted analyst took to create a great report using a macro-based replay function. And expect every BI vendor to offer some form of annotation, threaded discussions, and tighter integration with email.

We’ll also see a host of new independent collaboration platforms that could provide the glue to link people, process, and documents in more seamless, transparent ways and improve decision making. For example, SAP is working on an online collaborative environment called 12Sprints that provides templates for specific types of collaboration activities. And Google recently debuted Google Wave, its latest collaborative environment, which lets groups engage in seamless, instantaneous conversations. Of course, many companies already use Skype, Google Docs, Google Groups, Facebook, and Web conferencing systems, such as GoToMeeting, to foster formal and impromptu collaboration. These incumbents will slowly become more formally integrated with BI tools and decision-making processes.

Decision Governance

Although many BI teams do a great job monitoring BI usage, most have done little to nothing to monitor and evaluate the effectiveness of what users do with the information they're given. We need to begin tracking the decisions that users make with BI tools and measure the effectiveness of those decisions against business goals and plans. We need to start studying the decision-making process and apply procedures that increase the probability that users will correctly interpret the data and take appropriate actions. We can only do this by applying the same types of feedback loops we’ve applied to our BI systems themselves.
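One modest way to close that loop is a decision log that pairs each decision with its expected and observed outcomes, so it can be scored later against goals. A minimal sketch; the table and column names are hypothetical, not a prescribed standard:

    -- Hypothetical decision log: one row per decision made from a BI report.
    CREATE TABLE decision_log (
        decision_id     INTEGER PRIMARY KEY,
        report_name     VARCHAR(200),   -- the report that informed the decision
        decided_by      VARCHAR(100),
        decision_date   DATE,
        decision_text   VARCHAR(1000),  -- the action taken
        expected_result VARCHAR(1000),  -- the assumption or hypothesis behind it
        actual_result   VARCHAR(1000),  -- filled in after the fact
        goal_met        CHAR(1)         -- 'Y' or 'N', scored against plan
    );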

A terrorist’s attempt to blow up Northwest Airlines flight 253 last month revealed some fatal flaws in our country’s intelligence gathering activities, including a lack of coordination and information sharing among agencies. But another intractable problem, it turns out, is the faulty assumptions that analysts apply to evidence and the lack of organizational controls for testing and challenging those assumptions.

A recent article in the Boston Globe called “Think Different, CIA” provides some instructive lessons for companies using BI tools to make decisions. The article describes a phenomenon that psychologists call “premature cognitive closure” to explain how humans in general, and intelligence analysts in particular, can get trapped by false assumptions, which can lead to massive intelligence failures. It turns out that humans over the course of eons have become great at filtering lots of data quickly to make sense of a situation. Unfortunately, those filters often blind us to additional evidence--or its absence--that would disprove our initial judgment or “theory.” In other words, humans rush to judgment and are blinded by biases. Of course, we all know this, but rarely do organizations implement policies and procedures to safeguard against such behaviors and prevent people from making poor decisions.

The Next Wave

To take BI to the next level, we need to provide a collaborative environment to improve decision making and evaluate the effectiveness of decisions on a continuous basis. We need to establish processes and procedures to ensure people and teams properly interpret the data, identify and challenge each other’s assumptions, and keep an open mind about the drivers of business activity. By applying collaboration and governance to decision making, we can help our companies get even more value from their BI dollars. This really is the next wave of BI.

Posted by Wayne Eckerson on January 19, 2010


Seven Options for Better Requirements

I was asked recently, “What are the latest techniques for gathering business requirements?”

This is a loaded question: although there are dozens of techniques for gathering requirements, the most effective one is not to have to gather them at all. That is, your BI team has so much experience in your company and industry that it already knows what users are going to ask. And through continuous dialogue with the business--via both formal and informal channels--the team already knows the business processes and supporting data inside and out and has built much of the infrastructure needed to support current and future requirements. Beautiful.

Ok, I know, I know, I know….. Just anticipating what users want is not enough to build a solution. You still need to gather, verify, and document requirements and prototype a solution to ensure that you're building the right thing. It’s important to complement innate knowledge of the business with tried and true techniques for gathering requirements.

So, that being said, here are seven of my favorite requirements-gathering techniques (and I'd love to hear your favorites!):

1. Interviews. The old standby is the one-on-one interview. And if you have good listening and synthesizing skills, there is nothing better. Send out your interview questions in advance so people can think about their answers before you get there; you’ll get better, more thoughtful responses. And make sure you ask the right questions. Don't ask, "What data do you need?" or "What do you want the report to look like?" Rather, ask: "What are you trying to accomplish?" "What actions will you take if you have this information?" "What are your goals?" "What are your incentives?" Finally, make sure you interview above and across the group you are working for, as well as the folks in the group itself, to get the "global" perspective.

2. Joint application design (JAD). It's best to supplement individual interviews with a group session, especially to present the findings from the individual interviews and get feedback so you can refine the requirements.

3. Surveys. Believe it or not, surveys are a great way to capture solid feedback. In an interview, most users are put on the spot (unless they dutifully prepared for the interview). A survey gives them time to look at the questions and think about their answers before they submit them. You can also survey a lot more users than you can interview. Of course, the downside is that some people may never return the survey, in which case you need to schedule a one-on-one interview with them anyway. Always follow a survey with a joint design session to get feedback from the group.

4. Process mapping. Ultimately, a BI solution is a vehicle to shed light on one or more business processes. So find out what those processes are. Understand the flow of data through each process and how the availability of data can streamline them. Be careful, though: many users are wary of process reengineering sessions where you map workflows with stickies. Asking "day in the life of…" questions might be a better approach. Ultimately, BI developers with years of experience in the business or industry already know these processes and the data that supports them.

5. Reverse engineering. You can probably obtain 60% to 80% of the metrics, attributes, and dimensions for a new report or dashboard from existing reports or operational systems. There is often no need to recreate the wheel--just extend it a bit. Take the time to examine what people already use, but don't let those forms bias what you produce, either.

6. Prototypes. A great way to refine your requirements is by creating a prototype based on actual data if possible (fictional data if not) so users can see and interact with a system before it goes into production. A picture is worth a thousand words. Once users "see" the application, they'll know immediately whether it meets their requirements. Just make sure they understand it's a prototype, not a production system.

7. Automation Tools. There are tools that can assist in requirements definition. Designers have long tried to snare unsuspecting users into helping them refine entity-relationship models, but these are too intimidating for most users. Some vendors, such as Kalido, use higher-level conceptual models that make it easier for developers to conduct a dialogue with users. The models can then automatically generate schema and transformation code. Another vendor, Balanced Insight, provides a tool that lets users define dimensions, attributes, and metrics in plain English and then vote on the alternatives (hence the name "Balanced Consensus"). Like Kalido, it can generate star schemas and semantic layers for several leading BI tools. Other data mart automation tools in this genre include WhereScape and BI Ready.
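For readers new to the genre, the output of these tools is typically a star schema: a central fact table joined to conformed dimension tables. A minimal, hypothetical example of the kind of structure they generate:

    -- Hypothetical star schema: one fact table keyed to two dimensions.
    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,
        calendar_date DATE,
        fiscal_period VARCHAR(10)
    );
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name VARCHAR(100),
        category     VARCHAR(50)
    );
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date (date_key),
        product_key INTEGER REFERENCES dim_product (product_key),
        units_sold  INTEGER,
        revenue     DECIMAL(12,2)
    );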

What are your favorite techniques for gathering requirements? What gotchas lay on the road to perfect designs? I’d love to know!

Posted by Wayne Eckerson on January 17, 2010


Rethinking the Data Dictionary

We know the internet has transformed the encyclopedia. It’s now transforming the dictionary. For those of us who create and maintain data dictionaries, the implications are worth considering.

Wikipedia unshackled the encyclopedia from its paper-based, expert-centric, retail publishing heritage. Now anyone can contribute an entry or suggest edits and anyone with a Web browser and internet connection can use Wikipedia free of charge. This free-spirited collaborative approach to gathering and synthesizing human information has created a resource that dwarfs any traditional encyclopedia. Today, more than 85,000 active contributors have worked on 14 million articles in 260 languages.

Erin McKean, a lexicographer, is now trying to do the same for dictionaries. A former editor of the New Oxford American Dictionary, McKean is co-founder of a new online dictionary called Wordnik that leverages the internet to redefine what a dictionary is and how it works.

First, McKean doesn’t believe a dictionary should be the final arbiter of words but rather a collector. To that end, Wordnik encourages people to submit new words to the online dictionary or add new definitions to existing words. Second, and most importantly for our discussion, she believes words only have real meaning in context. Therefore, her dictionary not only publishes standard definitions (from traditional dictionaries), including synonyms and antonyms, but adds a host of contextual information to make the words come to life.

Contextual Information

For instance, when you type the word “sough” (meaning a soft murmuring sound), you currently see 50 examples of how the word is used in sentences that have appeared in books or articles. You can also hear it pronounced (courtesy of the American Heritage Dictionary), and you can read detailed etymologies of the word’s origins. You can also see words that aren't synonyms or antonyms but show up in the same sentence and provide valuable clues about its meaning. For example, words related to sough are whippoorwill, washing tub, and grooving.

Beyond soft context, Wordnik provides quantitative data. You can see a bubble chart that shows how much a word has been used every year going back to 1800, as well as statistics about punctuation applied to the word. Wordnik also links to Flickr images and tweets that contain the word so people can see how it is being used in modern-day parlance. People can also tag the word or add personal comments.

This rich contextual information turns the dictionary from a sterile arbiter of meaning into a sensuous, multidimensional exploration of culture and history through the vehicle of words. And perhaps best of all, it gets people excited about words.

Implications

So, what is the implication for our lowly data dictionaries?

I should point out that our data dictionaries have to be precise, even more so than traditional dictionaries, where a single word like “set” can have 33 definitions. Our terms can have only one precise meaning. They are the semantic gold standard for our organizations, the ultimate arbiter of meaning and the basis for our shared vocabulary and language.

But does that mean our data dictionaries have to be dry, static appendages to our BI environments? Of course not. In fact, if we take a cue from McKean, we can transform the data dictionary into an active agent of organizational and cultural knowledge, something that’s been missing from our BI and data governance programs.

Rethinking the Data Dictionary. Think about it. When we define a data element (metric or dimension), let’s show related data elements and link to their definition pages. Let’s encourage people to rate the element and comment on what they like or don’t like about it and how it can be improved. Data owners and stewards can moderate these online discussions and track the ratings. Let’s also encourage people (both business and IT) to suggest new elements to add to the dictionary and provide definitions and contextual information.

In addition, let’s display statistics of the number of reports in which the element appears and who uses those reports the most. And then let’s link directly from the dictionary to reports or dashboards that display the element so people can see how it’s used in context. The more context we provide to the definitions and descriptions in our data dictionaries, the more useful and used they will become.
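If your BI platform exposes its metadata and usage logs as queryable tables, a query along these lines could feed those statistics into the dictionary. The table names (report_elements, report_usage_log) are hypothetical stand-ins for whatever your platform provides:

    -- Hypothetical usage statistics: for each dictionary element, the number
    -- of reports it appears in and how often those reports are run.
    SELECT re.element_name,
           COUNT(DISTINCT re.report_id) AS report_count,
           COUNT(ul.run_id)             AS total_runs
    FROM report_elements re
    LEFT JOIN report_usage_log ul
      ON ul.report_id = re.report_id
    GROUP BY re.element_name
    ORDER BY total_runs DESC;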

Wikis. Now you might be wondering how to incorporate all this context into a lowly data dictionary. Let’s take a cue from Wikipedia and use a wiki. In fact, many BI teams are already experimenting with wikis to collect metadata, foster collaboration, and improve communications.

Sean van der Linden of AT&T Interactive delivered a presentation last year at a TDWI BI Executive Summit in which he described wiki templates his BI team uses to describe and define operational data sources, business processes, data elements, and reports, and showed how they can be used to facilitate requirements gathering, project management, and governance processes.

As we all know, it’s a Herculean feat to create standard definitions for key data elements. But once you do, rather than publishing static descriptions of these data building blocks, consider following McKean’s example and create an interactive metadata environment that provides rich context and a collaborative environment to enhance communication.

Tell me what you think!

Posted by Wayne Eckerson on January 13, 2010


Principle of Proximity: THE Best Practice in BI

After 15 years in the business intelligence industry, I’ve hit the mother lode: I’ve discovered the true secret to BI success. It’s really quite simple, and it’s been staring at us for years. It’s the principle of proximity.

By proximity, I mean seating your BI developers next to your business experts. Not just in a joint-application design session, a requirements interview, or scrum stand-up, but ALL THE TIME! Make them work side by side, elbow to elbow, nose to nose. It doesn’t work to merely locate them on the same campus or in the same building. You need to put them in the same cubicle block, or better yet, in one big room with no walls so everyone can see, hear, smell, and touch everyone else all the time. Radical, but effective.

And don’t mistake me: I’m not talking about business requirements analysts--I’m talking about developers who write the code and design the models. Yes, make the developers get the requirements right from the horse’s mouth. Don’t force them to learn requirements secondhand through a business requirements analyst. Trust me, something always gets lost in translation.

To develop awesome BI applications, you have to function like a small startup where there are no departments or organizational boundaries, no separate jargon or incentives, no separate managers or objectives, and NO WALLS. Just one big, messy, energetic, on-the-same-wavelength family that gets things done. And fast.

Role of Agile. I like agile software development methods. They come as close as any methodology to approximating the principle of proximity. If nothing else, go agile. Create a small team of business and technical people and make them do stand-up meetings daily, if not hourly! And hold them jointly accountable for the outcome.

But as good as agile can be, proximity is better. Why? When you place developers and business experts in the same room, they almost don’t need to talk. They absorb what they need to know by osmosis, and they learn to respect what each group needs to do to succeed. And fewer meetings make happier, more productive people.

Several years ago, Wes Flores, a technology manager at Verizon, told me the secret of his group’s success: “We sit side by side with business people and report into the same leadership. The only difference is that we specialize in the data and they specialize in the business process.”

So if you want to succeed at BI, reassign your business requirements analysts and immerse your BI developers in the physical heart of the business by applying the principle of proximity.

Posted by Wayne Eckerson on January 7, 2010


BI Jobs of Today... and Tomorrow

It is no secret that many business intelligence jobs are getting outsourced to low-cost centers around the world. In general, these are programming tasks that don’t require direct interaction with customers: ETL programming, testing, and some report development. As offshoring increases, we have to ask, “What are the BI jobs of the future and who will fill them?”

Drive the Business. Thomas L. Friedman wrote a column for the New York Times this fall (October 21, 2009) that sheds some light on the issue:

“A Washington lawyer friend recently told me about layoffs at his firm. I asked him who was getting axed. He said it was interesting: lawyers who were used to just showing up and having work handed to them were the first to go because with the bursting of the credit bubble, that flow of work isn’t there. But those who have the ability to imagine new services, new opportunities and new ways to recruit work were being retained. They are the new untouchables.”

“That is the key to understanding our full education challenge today. Those who are waiting for this recession to end so someone can again hand them work could have a long wait. Those with the imagination to make themselves untouchables—to invent smarter ways to do old jobs, energy-saving ways to provide new services, new ways to attract old customers or new ways to combine existing technologies—will thrive.”

BI Imagineers. The good news for BI professionals is that it truly takes imagination to deliver an effective BI application. BI needs people who can stand between the business and technology to create solutions that anticipate user requirements for information, who help create and monitor metrics that drive performance, and who help the business leverage information technology for competitive advantage.

Those individuals who can help their organizations harness technology for business gain will be in high demand and garner substantial salaries. But their knowledge can’t be gained easily—it takes years of working in the field applying technology to business problems before practical experience translates into imaginative solutions that drive the business. Imagination requires not just technical literacy—that’s just the ticket to play the game—but it takes deep knowledge of the business, its goals, strategy, people, and processes. Acquiring that knowledge takes time.

Apprenticeships Needed. I fear that if we offshore all the low-skill, entry-level BI jobs, people will never get the apprenticeships they need to become BI imagineers (to borrow a phrase from Disney). How will people gain a foothold in the industry if there are no entry-level jobs?

Jim Gallo, a consultant at Information Control Corp (ICC) in Columbus, Ohio, wrote a provocative article on this topic for the BI Journal, which I highly recommend. He writes, “It’s simple, really: Unless CIOs, CFOs, and CEOs make a commitment to provide opportunities to BI neophytes, we all run the risk that our BI organizations will cease to exist as strategic enablers within our own organizations.” (TDWI members can log in and read the article here.)

Gallo’s company has figured out an efficient and effective way to hire and train college graduates and make them productive members of an agile BI project team. In the article, Gallo discusses how these blended teams—composed of three junior developers, a senior architect, and a senior QA analyst—can compete cost-effectively against offshore BI players.

With such an apprenticeship, the junior developers on ICC’s agile teams are well on their way to becoming the BI leaders of tomorrow, garnering well-paying jobs. They are honing their technical skills by solving real customer problems under the guidance of senior BI architects and analysts. We need to make such opportunities available in our BI programs to create the imaginative leaders of tomorrow (if not today!)

Posted by Wayne Eckerson on December 28, 2009


KPIs for Data Warehousing Managers

It’s funny how we rarely eat our own dog food. Someone recently asked a question in TDWI’s LinkedIn group about the key performance indicators that data warehousing managers should use to measure BI/DW effectiveness. The first several respondents cited classic IT metrics, such as data transfer volume, data sizes, average ETL run times, average query response times, number of reports, and number of users. Then a few people reminded everyone that KPIs, by definition, are “key”—typically few in number, driven by top-level goals, easy to understand, and actionable.

The best KPIs drive dramatic improvements to the business. As such, there may be only one or two true KPIs per function (e.g., BI/DW) per level: operational, tactical, and strategic. Typically, each set of KPIs drives performance at the level above. Thus, operational KPIs in an operational dashboard drive tactical KPIs in a tactical dashboard, which in turn drive KPIs in a strategic dashboard. Each dashboard is used by different people at different levels of the BI/DW organization--from analysts and administrators to managers and architects to directors and sponsors--but the aligned KPIs ensure that everyone is working toward the same end.

For example, an operational dashboard measures in-flight processes, enabling ETL analysts, BI analysts, or database administrators to fix problems as they happen and optimize performance. At this level, KPIs for BI/DW might be: “The ETL job is spitting out a large number of errors.” “Query response times are slow.” Or “The number of users on the system is below normal.” In each case, an ETL developer, database administrator, or BI analyst, respectively, takes immediate action to investigate and fix the source of the problem.

A tactical dashboard measures departmental objectives and goals. In the BI/DW world, these goals are specified in service level agreements (SLAs) established with data warehousing customers. These might include DW availability (“The DW is refreshed by 8 a.m. on weekdays 90% of the time”) and DW reliability (“The DW is operational 99.99% of the time between 8 a.m. and 5 p.m. Eastern time”). These KPIs show whether the BI/DW team is on track to meet its SLAs in a given time period. If not, the BI manager or architect needs to figure out why and take action to meet the SLAs before it's too late.
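Measuring such an SLA is straightforward if the ETL process writes to an audit table. A sketch, assuming a hypothetical etl_load_audit table with one row per daily load; the column names and time comparison are illustrative:

    -- Hypothetical SLA check: percentage of weekday loads that finished
    -- by 8 a.m. in a given period.
    SELECT COUNT(CASE WHEN load_complete_time <= '08:00:00' THEN 1 END) * 100.0
             / COUNT(*) AS pct_loads_on_time
    FROM etl_load_audit
    WHERE load_date BETWEEN :period_start AND :period_end
      AND day_of_week NOT IN ('Sat', 'Sun');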

A strategic dashboard measures how well the BI/DW team is achieving its strategic objectives. Say, for example, the BI/DW team’s mission is to “empower business users with timely, reliable data that improves decisions.” A strategic KPI could be the number of unique BI users who run a report each week, the ratio of active to inactive users each week, or the percentage of users rating BI as “critical to making quality decisions this period” in a semi-annual survey. Lower-than-expected usage should prompt BI directors and sponsors to investigate what inhibits usage and take steps to improve it.
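A usage KPI like the active-user ratio can often be computed straight from the BI tool's usage log. A sketch, assuming hypothetical bi_usage_log and bi_users tables; DATE_TRUNC syntax varies by database:

    -- Hypothetical strategic KPI: weekly active BI users and the share
    -- of all licensed users who were active that week.
    SELECT DATE_TRUNC('week', run_date) AS week_start,
           COUNT(DISTINCT user_id)      AS active_users,
           COUNT(DISTINCT user_id) * 1.0
             / (SELECT COUNT(*) FROM bi_users) AS active_ratio
    FROM bi_usage_log
    GROUP BY DATE_TRUNC('week', run_date)
    ORDER BY week_start;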

Sometimes, a single strategic KPI from the BI/DW team lands on the scorecard of a senior executive. The executive uses the KPI to judge the effectiveness of the BI/DW resource and make funding decisions. Thus, it’s really important to think long and hard about that single KPI and how all your efforts are focused toward achieving positive results there. This can be done if your BI/DW KPIs are aligned so that lower-level KPIs drive higher-level ones and all are based on measurements that drive actions.

Let me know what you think!!

Posted by Wayne Eckerson on December 10, 2009


Twitter versus LinkedIn

Among all the social media, I prefer LinkedIn.

I must admit, I was a skeptic at first. LinkedIn did a great job of getting me to their site because I received requests to connect from people I knew. How could I resist? With a single click, I was on the LinkedIn site. But then I thought, “You did a great job in getting me here but now there’s nothing to do!”

But that was until LinkedIn Groups came along. Last spring, TDWI formed a LinkedIn group and now it has more than 11,000 members. (See TDWI LinkedIn.) What’s great about this group is that people discuss and debate real issues of concern to BI professionals. I learn a lot from the discussions and have even incorporated comments from the site into research reports. If I have an interesting question or topic, I can usually get a half dozen or more responses within a day or two. This is a real community!

Twittering the Day Away. In contrast, Twitter makes me dizzy. I follow only 100 people, but scanning through seemingly infinite posts, none related to another, many of which are cryptic, inane, or voyeuristic, is, well… exhausting, annoying, and frustrating. How do people follow 600 people? Or 1,000? (And I won’t even discuss Facebook or MySpace, which I find totally useless from a professional perspective.)

To me, Twitter is more akin to an online party than an educational tool. When I’m on Twitter I feel like there’s a lot of loud background music, and you can hear tidbits of conversation here and there, with loud eruptions every once in a while when someone gets out of line. I think Twitter appeals to both socially gregarious people who are forced to spend most of their day tied to a computer and egoistic loners who now have a great excuse to share their inner dialogue with the rest of the world.

That being said, there are times when Twitter is quite handy. For instance, I find it valuable to hear what people are learning (in 140 characters!) at a conference or briefing that I wasn’t able to attend. It’s informed me when two thought leaders start feuding about one issue or another. In many ways, Twitter gives you the pulse of the industry. But keeping up with the industry via Twitter is almost a full-time job. And I find it hard to tweet and get any real work done.

So, while I haven’t turned off Twitter, I’ve decided to allocate most of my social media time to LinkedIn, which makes me more productive rather than less.

Monitoring Needed. Of course, it takes care and feeding to make our LinkedIn site effective. I’ve posted guidelines for usage, and every day, I have to purge the site of spammers. And once a month or so, I summarize and publish some of the more interesting or heated discussions for people who don’t have time to read through countless entries. Without such monitoring, a LinkedIn group becomes an irrelevant spamfest, useless except for job hunters. 

With such safeguards, however, LinkedIn is a wonderful, educational community, a perfect adjunct to TDWI's physical conferences. See you online!

Posted by Wayne Eckerson on December 1, 2009


Succeeding with SaaS

I’m hearing a lot of success stories about deploying BI in the cloud. I believe these stories are just the tip of the iceberg.

Last week, I delivered a Webcast with Ken Harris, CIO of Shaklee Corp., a 50-year-old natural nutrition company that has run its data warehouse in the cloud for the past four years. (Click here for the archived Webcast.) And this past year, ShareThis and RBC Wealth Management discussed their successful cloud-based BI solutions at TDWI’s BI Executive Summits.

Adoption Trends

The adoption of the cloud for BI is where e-commerce was in the late 1990s: people had heard of e-commerce but laughed at the notion that a serious volume of transactions would ever take place over the wire. They said consumers would never embrace e-commerce for security reasons: people with network “sniffers” might steal their credit card numbers.

We all know how that story turned out. Today, e-commerce accounts for more than $130 billion in annual transactions, about 3.5% of total retail sales, according to the U.S. Census Bureau.

According to TDWI Research, a majority of BI professionals (51%) are either “somewhat familiar” or “very familiar” with cloud computing. About 15% of BI programs have deployed some aspect of their BI environment in the cloud today, but that percentage jumps to 53% within three years. And 7% said that either “half” or “most” of their BI solutions will run in the cloud within three years. That’s according to 183 respondents to a survey TDWI conducted at its November 2009 conference in Orlando.

Shaklee Success

At Shaklee, the decision to move the BI environment to the cloud was a no-brainer. Says CIO Harris: “It was an easy decision then, and it’s still a good one today.” Formerly a CIO at the Gap and Nike, Harris has many years of experience delivering IT and BI solutions, so his word carries a lot of clout.

Harris said his team evaluated both on-premise and cloud-based solutions to replace a legacy data warehouse. They opted for a cloud-based solution from PivotLink when the vendor was able to run three of the team’s toughest queries in a three-week proof of concept. Once commissioned, PivotLink deployed the new global sales data warehouse in three months “versus 18 months” for an on-premises system, said Harris.

PivotLink “bore the brunt” of integrating data from Shaklee’s operational systems. Although Harris wouldn’t say how much data Shaklee moves to its data warehouse daily, he said the solution has spread organically and now encompasses all sales, cost, and marketing data for Shaklee, a medium-sized home-based retailer. And since Shaklee has no internal resources supporting the solution, “the cost savings are big” compared to an on-premises solution, Harris says.

SaaS Vendors. Harris was quick to point out that not all Software-as-a-Service vendors are created equal. Harris has used a number of SaaS vendors for various applications, but not all have worked out, and he has pulled some SaaS applications back in house. “You need a vendor that really understands SaaS, and doesn’t just put a veneer on an on-premise piece of software and business model.”

Harris added that the challenge of implementing a SaaS solution pales in comparison to the challenge of implementing BI. “SaaS is easy compared to the challenge of delivering an effective BI solution.”

Posted by Wayne Eckerson on November 23, 2009