Data-Driven Brexit: A Wakeup Call for Analysts
The field of analytics has failed to grasp the reality that data is not reality.
- By Barry Devlin
- June 28, 2016
The morning of Friday, June 24 saw red across the markets. Mainstream politicians were horrified. The great British public had delivered their verdict, and it came as a shock. Why? Wasn’t the polling data clear enough in the weeks before the vote? Hadn’t the British people -- and, indeed, the world at large -- been shown all the facts needed to make rational, thoughtful decisions?
There are significant lessons for believers in data-driven business to learn from how data was and wasn’t used for decision making before, during, and after the Brexit vote.
Consider first the shock and horror expressed on Friday morning. Polls over the previous weeks had broadly agreed that the result was too close to call, and what little trend there was pointed toward a Leave result. Yet, as Anatole Kaletsky, economist and author of Capitalism 4.0: The Birth of a New Economy, wrote a week before the vote: “Analysts and investors have consistently assigned low odds to insurgent victories: in late May, betting markets and computerized models put the probabilities of Trump’s election and of Brexit at only around 25%, despite the fact that opinion polls showed almost 50% support for both.”
One might expect that analysts and investors would be among the leaders in using analytics to forecast future outcomes. Their post-referendum reaction suggests otherwise.
Since the result was announced, analytics has been to the fore, showing demographic breakdowns of voting patterns by age, geography, level of education, type of employment, and more. Graphics from The Guardian, for example, show varying levels of correlation. The best predictor of a Remain vote was holding a degree. Age was a much poorer predictor, yet it is the relationship that received the most coverage, with many young people quoted as feeling let down by their elders or sacrificed to outmoded thinking. Causation has received minimal attention, and even the most telling predictor of voting, higher education, has been largely ignored.
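To make the correlation point concrete, here is a minimal sketch of the kind of check behind such demographic breakdowns. The area-level figures below are invented placeholders, not The Guardian’s data, and the Pearson helper is a plain standard-library implementation rather than any tool used in the coverage.

```python
# A minimal sketch, on invented area-level figures, of correlating demographics
# with the Leave vote share. Placeholder data only -- not actual referendum results.
from statistics import mean, stdev

def pearson(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical voting areas: share of residents holding a degree, median age,
# and Leave vote share in each area. All values are made up for illustration.
pct_with_degree = [0.42, 0.25, 0.35, 0.18, 0.51, 0.22]
median_age      = [36, 47, 39, 44, 41, 45]
leave_share     = [0.38, 0.61, 0.52, 0.67, 0.31, 0.59]

print("degree vs. Leave:", round(pearson(pct_with_degree, leave_share), 2))
print("age vs. Leave:   ", round(pearson(median_age, leave_share), 2))

# With these invented figures, the degree share correlates strongly and negatively
# with the Leave share, while age correlates positively but more weakly -- and, as
# noted above, neither correlation says anything about causation.
```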
How did these data “usage” patterns emerge? What do they say about the use of data in society?
Data (or, more correctly, information) was used by both sides of the Brexit debate to justify their stances. It has since emerged that at least some of it was mis- or disinformation. To mention but one example, a claimed contribution of £350m per week to the EU has since been shown to be approximately double the real net amount. More disturbingly, predictive economic information supplied to the public by such presumably competent and reputable experts as the UK Statistics Authority and the International Monetary Fund, among others, was dismissed with the memorable put-down: “I think people in this country have had enough of experts.”
Herein lie perhaps the most important lessons of Brexit for those of us in the analytics field. Data is not sacrosanct. Information will be accepted only when it conforms to preconceived notions. Expertise is not sufficient and, in extremis, will be dismissed with ridicule. Boris Johnson’s memorable dismissal of the evidence offered by the Remain campaign as a “most extraordinary avalanche of scaremongering, a sort of Himalayan snow job of statistics” sums up the kind of emotive, political reaction that disputed data often provokes. I suspect that many of us have encountered similar reactions in corporate boardrooms.
Whether in the phrase data-driven, the concept of big data, or the original data warehouse, the industry now known as analytics has failed to grasp the reality that data is not reality. It does not tell a story, does not make a case, and emphatically does not carry an argument. As I’ve discussed in “How Do You Make Decisions? (Part 4),” human attitude -- including emotion, intuition, and social empathy -- and motivation are at the heart of decision making and the action that follows.
The value of Brexit, should we choose to see it, is as a wakeup call to our industry -- and society in general -- to reexamine our belief system about the value of data. Whether in our trust in polls, our belief in experts, our credence in the press or other information sources, or our faith (or otherwise!) in politics, the process and outcome of this debate must surely cause us to explore better ways of reaching decisions than our current soundbite-driven process.
About the Author
Dr. Barry Devlin is among the foremost authorities on business insight and one of the founders of data warehousing in 1988. With over 40 years of IT experience, including 20 years with IBM as a Distinguished Engineer, he is a widely respected analyst, consultant, lecturer, and author of “Data Warehouse: From Architecture to Implementation” and “Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data,” as well as numerous white papers. As founder and principal of 9sight Consulting, Devlin develops new architectural models and provides international, strategic thought leadership from Cornwall. His latest book, “Cloud Data Warehousing, Volume I: Architecting Data Warehouse, Lakehouse, Mesh, and Fabric,” is now available.