
Q&A: Don't Present Results, Tell Stories

All those fancy tools for charting and graphing won't help you unless you know your audience and tell stories about a few key results.

With an ever-increasing choice of data display options, too many people forget the purpose of charts and diagrams -- to help the audience understand the data. In this interview, we talk with Stanford scientist Jonathan Koomey, who has focused his career on energy conservation technology, economics, and policy, as well as climate change solutions. As he explains during our conversation, he is also deeply interested in how analytic results are presented.

"The purpose of graphing is not to display technical virtuosity," Koomey says. "People think they're supposed to be using these fancy tools in fancy ways, but it's not about showing how clever you are, it's about giving insight." In this interview, he describes some of the biggest mistakes analysts make both in analyzing and presenting data, and how to avoid those errors.

Koomey is the author or coauthor of eight books, including "Turning Numbers into Knowledge: Mastering the Art of Problem Solving," which now has over 30,000 copies in print in English, Chinese, Italian, and (soon) Korean. He is currently a consulting professor at Stanford University and spoke at a TDWI Webinar in November on "Effective Presentation of Analytical Results."

BI This Week: Your career is focused on energy and resource conservation. Yet you write and speak fairly often on analytics and the presentation of analytic results. How do those two areas overlap?

Jonathan Koomey: The original purpose in writing my book, Turning Numbers into Knowledge, was to have something to give to smart young scientists I was hiring. I kept having to explain the basics of presentations to them -- how to make a good table or graph, what constitutes adequate documentation, and how to evaluate claims made by other people. I realized that these lessons really should be written down. So now, anytime I work with someone, I give them a copy of the book and say, "Read this and you'll know what I'm expecting."

Even bright young scientists struggle with presenting data effectively to an audience?

That's absolutely true. One reason is that these skills aren't taught in universities very often. I've mostly learned by trial, error, and experiment. I hope that by reading my book, others can learn these lessons more quickly.

Do these rules apply to writing as well as speaking?

I think people have difficulty with both. There are some differences in writing versus speaking presentations, but many of the same lessons apply. Focusing on and understanding your audience is one of the key lessons, and that applies regardless of the medium.

Presenters get focused on the details and they forget that explaining something to others shouldn't necessarily follow the same path as figuring out the information in the first place. Analysts also often forget that other people don't care as much as they themselves do about what they're doing, so they need to summarize the key results in compelling ways.

In your TDWI Webinar, you emphasized the pyramid concept of presentations, which journalists have used for many years. Can you recap that explanation?

The biggest mistake analysts make is trying to explain things in much more detail than the person listening really has the time or interest for. People who are more technical are really into the details, which is a good thing. They forget in the end, however, that generating information is not the same thing as conveying it in a compelling way. People can really absorb only a few key points from your paper or talk, so you need to choose the key points that you want to convey. Don't overwhelm them with details.

That's where the idea of the pyramid comes in; it's originally from a book by Barbara Minto called The Minto Pyramid Principle. The basic notion is that for any big conclusion there are many supporting facts. You can think of the big conclusion as the point at the top of the pyramid, and the supporting facts as gradually expanding outward toward the bottom. There are many more detailed facts at the bottom, which build upward to the summary point. The very top of the pyramid is supported by all those facts.

The technical people developing the many facts that make up the base of the pyramid are doing important work. They need to do that carefully and make sure they're doing it right, but at the end of the day, they shouldn't start by explaining all those details, because that's not what is important to the executive or decision-maker. These busy people just want to know the summary.

A good example of how this works in practice is a newspaper article. Journalists start with the key point, then the second most important point, then the third, and so on. Gradually, they fill in the details at the base of the pyramid, but they lead with the top of the pyramid because that's the part the most readers will see.

Is that the biggest mistake you see people make -- failing to present the most important point first?

It's a general problem with presentations and writing. Technical people, by their nature, get focused on the details. That's good. We want that. We want those details to be carefully worked out. [However,] communicating that information takes a different approach in order to be effective. If you can give people a few key points, they're going to find your analysis more compelling, and they'll ask you for more details if they want them. You'll make your listeners a lot happier. No one ever complains about a presentation that ends early or a paper that's quite succinct.

In your presentation, you also talked about the importance of documentation. Just how important is it not only to have good data, but to be able to defend your data to an audience?

Documentation has multiple purposes, one of which, as you say, is to defend your conclusions with adequate and clear descriptions of what you did, how you did it, the data sources you used, and the people you relied on to conduct the analysis.

You also need documentation for your own internal purposes. I find documentation helpful for checking my own work and making sure I'm doing it right. By not preparing documentation -- and skipping it is all too often the norm -- people miss out on corrective feedback they could be using to improve their own analysis.

I see people relegate documentation to a secondary role because it takes time and they say, "I don't have the time to write all that down." Guess what? If you actually took time to write things down, you'd add a direct and measurable benefit to your analysis. You might realize what you didn't do right. Furthermore, in six months or a year, what if you have to re-create your analysis -- or worse, what if the person who succeeded you has to re-create your analysis? They're stuck unless you've described clearly what you did.

It comes back to this idea in science of reproducibility. One of the key criteria for whether information is considered to be reliable is if independent people can reproduce what you did.
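Koomey's reproducibility point lends itself to a concrete sketch. Below is a minimal illustration in Python of one way to record documentation alongside an analysis. The file names, data source, and NPV calculation are hypothetical examples, not anything Koomey prescribes -- the point is the habit of writing down sources, methods, and assumptions as you go.

    import json
    from datetime import datetime, timezone

    # Hypothetical inputs -- in a real analysis these would identify your
    # actual data files and where they came from.
    DATA_SOURCE = "utility_load_2012.csv"   # hypothetical file name
    DATA_RETRIEVED = "2012-11-01"           # when the raw data was pulled
    ASSUMPTIONS = {
        "discount_rate": 0.05,              # annual, real
        "horizon_years": 10,
    }

    def investment_npv(upfront_cost, yearly_saving, years, rate):
        """Net present value of a simple efficiency investment (illustrative)."""
        npv = -upfront_cost
        for t in range(1, years + 1):
            npv += yearly_saving / (1 + rate) ** t
        return npv

    result = investment_npv(
        upfront_cost=10_000,
        yearly_saving=1_800,
        years=ASSUMPTIONS["horizon_years"],
        rate=ASSUMPTIONS["discount_rate"],
    )

    # Write the documentation next to the result, so anyone -- including you,
    # six months from now -- can see what was done, with what data, and why.
    record = {
        "run_at": datetime.now(timezone.utc).isoformat(),
        "data_source": DATA_SOURCE,
        "data_retrieved": DATA_RETRIEVED,
        "assumptions": ASSUMPTIONS,
        "method": "simple NPV of an energy-efficiency investment",
        "result_npv_usd": round(result, 2),
    }
    with open("analysis_provenance.json", "w") as f:
        json.dump(record, f, indent=2)

However it's formatted, a record like this is what lets a successor -- or you, a year later -- re-create the analysis instead of starting from scratch.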

Is there a single egregious error that you see people making over and over in performing analysis or in displaying the results of analysis?

I'd like people to remember that doing analysis is about better decision-making. That's another important lesson from my work. Analysis that is done without a decision in mind is often not well focused and leads to wasted time and effort.

I divide analyses into two types: exploratory and more formal or structured analysis. The exploratory kind is, of course, where people do experiments and try to learn things in a relatively free-flowing way. In that case, you may not need to have a specific decision in mind.

For any kind of formal analysis project, however, the first question the analyst should ask the boss is, "What decision do you have to make?" Then work to truly understand as much as you can about what the different dimensions of that decision are, who's going to make it, who has to make changes to implement the decision, and what kind of data you have available to turn your analysis into a decision-relevant document. Always bring it back to the decision.

If you're doing analysis and suddenly you feel you're off track, ask yourself: "What decision will be made from this information?" Every time you ask about the decision, you'll be better able to do your analysis in a more focused way because you'll have a much clearer sense of what's important. You'll be able to just clear away certain things that are not relevant to that decision and move forward in doing your analysis.

You had some great examples in the Webcast of poor uses of charts and diagrams, along with ways to improve them. Can you share a few rules for avoiding some of those mistakes?

One of the big mistakes people make is to assume that because something is a default graph in Microsoft Excel, it's somehow good practice. Personally, I love Excel. I've been using it since it came out on the Mac in 1985. It's a wonderful program, but in some cases, the graphing part of it encourages unfortunate behavior.

The big lesson is to avoid using those defaults, as Stephen Few points out. Instead, think carefully about the data you're presenting. As Edward Tufte says, "Above all else, show the data." Don't add extraneous ornamentation. Don't add three-dimensional bars or other graphical effects unless they actually contribute to understanding the data. For the most part, 3-D bar charts add no information. Instead, they take away from understanding, and make it much harder to see what the numbers show. There are two points here: One, don't use the defaults in Excel, and don't assume that those defaults present good graphing practice. Two, don't be tempted to add ornamentation that distracts from the data.

It helps, I think, to focus on the overall concept of telling a story. Don't just make the first graphs or tables that come to mind. Think about the story you're trying to tell and make graphs and tables that tell that story. Focusing on the storytelling is critical because it means you'll carefully choose which graphs to make, and you'll craft them in a way that will allow your readers or listeners to understand the key points that you want them to take away from the talk or the paper.
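Koomey's examples were built in Excel, but the "show the data" principle translates directly to code. Here is a minimal sketch using Python and matplotlib -- our choice of tool, not one the interview endorses -- with made-up numbers: a flat 2-D bar chart with the default clutter removed and values labeled directly.

    import matplotlib.pyplot as plt

    # Hypothetical data -- the point is the presentation, not the numbers.
    quarters = ["Q1", "Q2", "Q3", "Q4"]
    savings_mwh = [120, 145, 160, 210]

    fig, ax = plt.subplots(figsize=(5, 3))
    bars = ax.bar(quarters, savings_mwh, color="#4878a8")

    # "Above all else, show the data": strip ornamentation that carries no
    # information -- no 3-D effects, no gridlines, no box around the plot.
    for side in ("top", "right", "left"):
        ax.spines[side].set_visible(False)

    # Label the bars directly so readers don't have to trace values back
    # to an axis; the y-axis then becomes redundant and can be hidden.
    ax.bar_label(bars, fmt="%d MWh", padding=3)
    ax.tick_params(left=False, labelleft=False)

    ax.set_title("Energy savings by quarter (illustrative data)")
    fig.tight_layout()
    fig.savefig("savings_by_quarter.png", dpi=150)

Whichever tool you use, the test is the same: every mark on the chart should help the reader see the story in the numbers.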

It's almost as if we have so much data today, and so many ways to display it, that we can't resist showing our audience way too much information.

Right. The purpose of graphing and writing is not to show how clever we are or to impress people with all the data we've collected. Instead, sophisticated tools can help explain the important insights you've gained from your analysis, if you use them right. In the end, it's not about showing technical virtuosity -- it's about giving insight, telling stories, and better supporting decision-making.

Further reading suggested by Jonathan Koomey:

Few, Stephen. 2004. Show Me the Numbers: Designing Tables and Graphs to Enlighten. Oakland, CA: Analytics Press.

Few, Stephen. 2009. Now You See It: Simple Visualization Techniques for Quantitative Analysis. Oakland, CA: Analytics Press.

Koomey, Jonathan. 2008. Turning Numbers into Knowledge: Mastering the Art of Problem Solving. 2nd edition. Oakland, CA: Analytics Press.

Tufte, Edward R. 1995. The Visual Display of Quantitative Information. Cheshire, CT: Graphics Press.

Other titles of interest:

Norman, Donald A. 1990. The Design of Everyday Things. New York, NY: Doubleday/Currency.

Heath, Chip, and Dan Heath. 2007. Made to Stick: Why Some Ideas Survive and Others Die. New York, NY: Random House.

Huff, Darrell. 1993. How to Lie with Statistics. New York, NY: W. W. Norton & Co., Inc.

Hughes, William. 1997. Critical Thinking: An Introduction to the Basic Skills. Peterborough, Ontario: Broadview Press.

Bialik, Carl. "The Numbers Guy." The Wall Street Journal.
