Where is Cognitive Decision Making in BI? (Part 2 of 3)
Cognitive computing brings together a wide range of disciplines and technologies to address complex situations filled with ambiguity and uncertainty. With so much to recommend it, why isn't it more ubiquitous in enterprise decision making?
- By Barry Devlin
- March 30, 2016
Way back in early 2011, IBM Watson beat Jeopardy! champions Ken Jennings and Brad Rutter in a now-famous TV challenge. In March of the same year, I wrote in a blog post, Tough Analytics? Watson to the Rescue: "It seems likely that over the next few years this combination of technologies [as deployed in IBM Watson] will empower business users to ask the sort of questions that they've always dreamed of, and perhaps haven't even dreamed of yet.
They will gain access, albeit indirectly, to a store of information far in excess of what any human mind can hope to amass in a lifetime. They will also receive answers based directly on the sum total of all that information, seeded by the expertise of renowned authorities in their respective fields and analyzed by highly structured and logic-based methods."
As I discussed in Part 1, the technologies underpinning Watson have evolved by leaps and bounds, and many more have been added in the intervening years. Artificial intelligence, deep learning, and cognitive computing have become part of the public discourse. What of the impact on business intelligence I predicted in 2011?
I believe I was overly optimistic in my prediction. Two recent commentaries suggest the BI market is perhaps overinvested in the old ways of doing BI. Independent analyst Mark Madsen remarked in a recent interview, "Everything that we have been doing is an outgrowth of the old mainframe reporting mentality... Today, a BI tool still works like a decision support tool. You specify queries. You get the data back. You make it look pretty on the screen. But it is not substantively different from what we were doing 20 years ago."
I agree. Business users are, in the main, still asking the same old types of questions rather than exhibiting the exploratory behavior that cognitive computing could offer. Bill Inmon, who promotes the ingestion of "disambiguated text" into the data warehouse, writes, "[T]ext in [narrative] form cannot be easily used for analysis. ... [T]he computer still chokes on data when working with narrative data. ... There is sentence structure, for example, and terminology. There is the barrier of different languages. Most of all, there is context."
Indeed, most of all, there is context -- that old bugbear of business intelligence -- but I think Inmon misses the point. Context is becoming computable, as the Cognitive Computing Consortium definition says, but not in the old structured ways. Neural networks are rapidly learning to contextualize not just textual information but also audio and visual information, as shown by autonomous vehicles.
In the old world of BI, context has long been conflated with metadata, a term that has been misappropriated by everybody from ETL programmers to NSA spies. I coined the term context-setting information (or CSI, if you prefer) to emphasize what metadata is really supposed to do, mainly for business users: provide the context needed to truly understand the available information and use it to offer insight for decision making.
Traditional BI assigns a separate and distinct identity to metadata, whereas context-setting information is an integral part of all information (unless it has been stripped out to leave the hard data that is required in relational and similar databases). Cognitive computing offers, instead, the ability to discover and use this context-setting information in situ.
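The distinction can be sketched in code. Below is a minimal, hypothetical illustration (the field names and values are my own, not from any particular system): a bare relational row with its context stripped out, versus the same fact carrying its context-setting information in situ.

```python
from dataclasses import dataclass

# "Hard data" as a relational table might store it: the context
# (units, provenance, timing) has been stripped out and lives,
# at best, in a separate metadata repository.
bare_row = (40321.50, "EMEA")

# The same fact with its context-setting information kept in situ.
# All field names here are illustrative assumptions.
@dataclass
class ContextualFact:
    value: float
    region: str
    unit: str       # what the number actually measures
    source: str     # provenance: where the value came from
    as_of: str      # temporal context
    notes: str = "" # free-text narrative context

fact = ContextualFact(
    value=40321.50,
    region="EMEA",
    unit="USD, weekly net sales",
    source="finance extract, orders feed",
    as_of="2016-03-25",
    notes="Includes a one-off licensing deal; compare with caution.",
)

# A decision maker (or a cognitive system) can interpret the number
# without a round trip to a separate metadata store.
print(fact.value, fact.unit, fact.as_of)
```

The point of the sketch is only that the second representation keeps meaning attached to data; discovering such context automatically, rather than modeling it by hand, is what cognitive approaches promise.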
Herein lies the challenge to BI practitioners and vendors, and it may explain the slow uptake of machine learning and similar technologies in business decision making. In some sense, this is an alien land. Cognitive decision making begins to sound a bit like grokking -- Robert A. Heinlein's Martian word from Stranger in a Strange Land -- defined as "to understand so thoroughly that the observer becomes a part of the observed -- to merge, blend..."
However, there is another barrier to the uptake of cognitive computing in BI. Ironically, I believe this to be the fascination of the industry with data -- currently big data and Internet of Things data -- rather than the actual use of information in moving from decision making to action taking. I'll take a look at that in Part 3.
Dr. Barry Devlin defined the first data warehouse architecture in 1985 and is among the world’s foremost authorities on BI, big data, and beyond. His 2013 book, Business unIntelligence, offers a new architecture for modern information use and management.