Enable Deeper Understanding with Great Data Storytelling

How do you tell a good quantitative story? According to Angela Bassa, it's at once a straightforward enterprise and a methodological minefield.

How do you tell a good quantitative story? Is a good quantitative story more or less like a good regular story, albeit with, you know, numbers? Can quantitative stories inform, persuade, inspire, and entertain listeners? How do quantitative stories, good and bad, differ from regular stories?

These are the kinds of questions that keep Angela Bassa up at night. Bassa, director of data science with iRobot, will be talking about data science and quantitative storytelling at TDWI's upcoming Accelerate conference, which will be held in Boston in early April. TDWI bills Accelerate as "the leading conference for analytics and data science training," and the event offers deep-dive tutorials, networking opportunities, and presentations and keynotes from luminaries such as Bassa, Michael Li, Claudia Perlich, and Eduardo de la Rubia.

The Importance of Quantitative Storytelling

Bassa's talk on quantitative storytelling should be one of the highlights. As she sees it, telling a good quantitative story is at once a straightforward enterprise -- quantitative stories, like their regular counterparts, have a beginning, middle, and end, with rising action, a climax, and falling action -- and a methodological minefield.

"A huge part of why people do data science is to communicate or persuade, and in trying to [convey] the results of an analysis, that's sometimes where things fall apart," she says, noting that Super Bowl LI -- in which the Atlanta Falcons leaped out to a 28-3 lead over the New England Patriots -- offers a perfect illustration of what she means.

"If you looked at a win probability chart late in the second quarter, the Falcons were overwhelming favorites to win. Toward the end of the game, at one point, the win probability estimate had the Falcons winning with 99 percent probability. However, win probabilities are not predictions. A 99 percent win probability is not a prediction the Falcons are going to win. It's based on a simulation of the game up to that point," she explains.

"Basically it says, if we could play the rest of the game 1,000 times, the Falcons would win 990 of those times. We do a poor job of educating people how to interpret these tools, so they become these alchemical things that folks just look at and gloss over. They think 'This is too hard -- I can never understand it.' We not only can do better, we must do better."

Guiding the Audience's Understanding

According to Bassa, quantitative storytelling is as much about guided interpretation as anything else.

"You have to guide the audience toward a sort of intuitive understanding of an analysis. Fundamentally, it's a communications issue. Once you've done all the hard work of putting together your data and analyzing it and modifying it, then you have to communicate it. How do you communicate a quantitative story in a way that humans are shaped to interpret it?"

That's the big challenge, Bassa notes. After all, millions of years of evolution have honed human beings to think and understand deterministically, and causality is a prime example: we're predisposed to assume it.

"The risk when you're telling a quantitative story and you're building a narrative is building in terms of causality -- 'because of this then that.' You're ascribing a causality that maybe isn't true. It's one of those natural biases we have. We see an effect and we go 'This happened because that.' Using 'because' is really dangerous because data doesn't really tell you a 'because.'"

Practice Makes Perfect

In the final analysis, learning to tell a good quantitative story is a lot like learning a new language, she argues: it takes practice. "To learn a new language, you have to build fluency. If you don't practice, you become less fluent. The same thing is true of numeracy or mathematical literacy: if you don't practice that muscle memory, it's very easy to fall back into natural patterns of thinking."

Part of building up muscle memory is being alert to the all-too-human biases -- such as spurious causation -- that make us who we are. These biases have served us for hundreds of thousands of years, and Bassa says they are baked into anything we as human beings build or create. This includes not just algorithms, models, or (at a higher level) sensors and signalers, but data itself.

"Data comes from sensors, but your sensors might be biased. Your collection mechanisms might be biased. Just because it is a quantitative story doesn't mean it's an objective story. There's always a subjective story because of all the human fingers that have touched it," she points out.

About the Author

Stephen Swoyer is a technology writer with 20 years of experience. His writing has focused on business intelligence, data warehousing, and analytics for almost 15 years. Swoyer has an abiding interest in tech, but he’s particularly intrigued by the thorny people and process problems technology vendors never, ever want to talk about. You can contact him at [email protected].
