

Where is Cognitive Decision Making in BI? (Part 1 of 3)

Cognitive computing brings together a wide range of disciplines and technologies to address complex situations filled with ambiguity and uncertainty. With so much to recommend it, why isn't it more ubiquitous in enterprise decision making?

On 15 March 2016, Google's artificial intelligence (AI) program AlphaGo was awarded an honorary ninth-dan rank, having defeated world champion Lee Se-dol 4 games to 1 in Seoul. According to South Korea's Go Association, the rank indicates that its ability had reached a level "close to the territory of divinity." AI experts and commentators have placed the program on a similar plane.

Only last January, Google had announced that AlphaGo had beaten Fan Hui, the reigning three-time European Go champion, 5 games to 0. To quote Prof. Zoubin Ghahramani of the University of Cambridge: "This is certainly a major breakthrough for AI, with wider implications. The technical idea that underlies it is the idea of reinforcement learning -- getting computers to learn to improve their behaviour to achieve goals. That could be used for decision-making problems -- to help doctors make treatment plans, for example, in businesses or anywhere where you'd like to have computers assist humans in decision making."
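
To make the quoted idea of reinforcement learning a little more concrete, here is a minimal, self-contained sketch of tabular Q-learning on a toy problem, written in Python. It is purely illustrative -- AlphaGo's actual approach combines deep neural networks with Monte Carlo tree search -- and every name and parameter below is invented for the example.

```python
import random

# Toy reinforcement learning: tabular Q-learning on a 1-D corridor.
# The agent starts at cell 0 and is rewarded only when it reaches the
# goal at cell 4, so it must learn to improve its behaviour over time.

N_STATES = 5          # cells 0..4; cell 4 is the goal
ACTIONS = [-1, +1]    # step left or step right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Apply an action and return (next_state, reward, done)."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    done = (nxt == N_STATES - 1)
    return nxt, (1.0 if done else 0.0), done

for episode in range(500):
    state, done = 0, False
    while not done:
        # Epsilon-greedy: mostly exploit what has been learned, sometimes explore.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        nxt, reward, done = step(state, action)
        # Q-learning update: nudge the estimate toward the reward plus
        # the discounted value of the best action from the next state.
        best_next = max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = nxt

# The learned policy converges to "move right" in every cell.
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES)})
```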

The question that arises is: how does having "computers assist humans in decision making" relate to traditional BI and analytics?

First, I will cast the net a little wider than artificial intelligence alone. Cognitive computing is a cross-disciplinary field that brings together AI, neural networks, deep and reinforcement learning, and natural human interaction, as well as the ability to sense and affect the real world -- all of which are vital for decision making. It's an emerging field, so some definitions may be of value.

In "Computing, Cognition and the Future of nowing," IBM's John Kelly proposes that cognitive systems "learn at scale, reason with purpose, and interact with humans naturally ... generate not just answers to numerical problems, but hypotheses, reasoned arguments, and recommendations about more complex -- and meaningful -- bodies of data."

The Cognitive Computing Consortium also provides an excellent, if long, definition. Here are its key ideas. "Cognitive computing makes a new class of problems computable. It addresses complex situations that are characterized by ambiguity and uncertainty. ... In these dynamic, information-rich, and shifting situations, data tends to change frequently, and it is often conflicting. The goals of users evolve as they learn more and redefine their objectives. To respond to the fluid nature of users' understanding of their problems, the cognitive computing system offers a synthesis not just of information sources but of influences, contexts, and insights. To do this, systems often need to weigh conflicting evidence and suggest an answer that is 'best' rather than 'right.' Cognitive computing systems make context computable."

Although these definitions may seem extremely broad, the reality is that cognitive computing brings together a wide range of disciplines and technologies. The result is that stories about cognitive computing are appearing in a variety of places, including image classifiers (of everything from cats to cancerous cells), autonomous cars and drones, and virtual assistants embedded in smartphones, browsers, and the intriguingly named Amazon Echo. All these applications work in a fundamentally similar fashion: ingest a large volume of information (not data) from the real world, apply self-learning algorithms to "understand" the problem, and use that understanding to effect changes in the real world.
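
The sketch below makes that shared shape explicit as a schematic ingest-learn-act loop in Python. The component names (sensor, model, actuator) and their methods are hypothetical placeholders rather than any real framework; the point is only the flow of information through the loop.

```python
class CognitiveLoop:
    """Schematic loop shared by the applications above: ingest, learn, act."""

    def __init__(self, sensor, model, actuator):
        self.sensor = sensor        # ingests information from the real world
        self.model = model          # self-learning component that builds "understanding"
        self.actuator = actuator    # effects changes back in the real world

    def run_once(self):
        observation = self.sensor.read()             # 1. ingest information (not data)
        self.model.update(observation)               # 2. learn from what was observed
        decision = self.model.decide(observation)    # 3. turn understanding into a decision
        self.actuator.apply(decision)                # 4. act on the real world
```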

Advances in each of these three areas -- ingesting information, learning from it, and acting on the world -- have combined to drive the impressive growth spurt seen in cognitive computing over the last year or two.

First, the growth of "big data" on the Internet has provided an immense set of training information for AI systems. This soft information -- images, video, text, and audio -- is increasingly being supplemented by an even larger volume of semi-structured data from the Internet of Things. The extent and richness of all this information are what enable AI algorithms to learn.

Second is the rapid advance of the neural networks at the heart of self-learning. Loosely modeled on the structure and workings of the human brain, these networks have seen much theoretical progress in recent years. However, it is the vast growth in parallel computing power that has enabled many of the best-known advances, including the AlphaGo example above.
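
For readers who have not seen one up close, the sketch below trains a deliberately tiny neural network (two inputs, one hidden layer, one output) to learn the XOR function using plain NumPy. It is a toy, orders of magnitude smaller than anything behind AlphaGo, but the matrix arithmetic in its forward and backward passes is exactly the kind of work that modern parallel hardware accelerates.

```python
import numpy as np

# A tiny neural network learning XOR with plain NumPy.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8))   # input -> hidden weights
b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for epoch in range(5000):
    # Forward pass: compute the network's current predictions.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the squared error through each layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(3))   # approaches [[0], [1], [1], [0]]
```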

The third growth area is the combination of robotics and the digitalization of business, both of which enable the output of AI systems to directly affect the real world.

Consider how these three steps play out in autonomous vehicles, for example. They operate by sensing their surroundings, making decisions about how to drive, and then applying those decisions through the vehicle mechanics. George Hotz built a self-driving car in his garage last year and demoed it to Bloomberg Business. The neural net software had "learned" to drive by "watching" him drive for ten hours on the highway. The car then drove as well as a teenage learner driver. Hotz would not (or maybe could not) fully explain how the software had achieved this.

From this example and others, it is clear that all aspects of cognitive computing are becoming more pervasive and affordable. It is equally clear that cognitive computing systems are viable and successful in real-time decision making. Applying this thinking to business decision making is the topic of Part 2 of this series.

About the Author

Dr. Barry Devlin is among the foremost authorities on business insight and one of the founders of data warehousing, dating back to 1988. With over 40 years of IT experience, including 20 years with IBM as a Distinguished Engineer, he is a widely respected analyst, consultant, lecturer, and author of "Data Warehouse -- from Architecture to Implementation" and "Business unIntelligence -- Insight and Innovation beyond Analytics and Big Data," as well as numerous white papers. As founder and principal of 9sight Consulting, Devlin develops new architectural models and provides international, strategic thought leadership from Cornwall. His latest book, "Cloud Data Warehousing, Volume I: Architecting Data Warehouse, Lakehouse, Mesh, and Fabric," is now available.

