July 7, 2016
Feature Story
Why Cognitive Computing Is So Important for Healthcare
Judith S. Hurwitz
President and CEO,
Hurwitz & Associates

The healthcare industry is complex, highly regulated, composed of legacy systems, and ripe for change and opportunity.

It is a large ecosystem that encompasses many different types of organizations, including:

• Government regulatory agencies

• Health information providers

• Healthcare payers

• Healthcare providers

• Independent research and testing laboratories

• Medical device manufacturers

• Pharmaceutical companies

The common denominator among all these players is that they create and manage a huge amount of data. Typically, this data is managed in silos, without shared context or common metadata. That is no longer good enough: to drive better decisions, an analyst must be able to bring together data from across all of these areas. To make things even more complicated, the data comes in many different forms, including structured data from medical tests and demographics and unstructured data from doctors’ notes, CT scans, and MRIs.

Why do we need to take a holistic view of data across these silos? It is clear that there is much that we can learn from unlocking the knowledge from the massive amount of healthcare data that already exists. One of the great challenges for the healthcare industry is the need to understand the patterns and anomalies hidden in data to improve treatments and discover new drugs. In addition, we need better insights to quickly understand new viruses and epidemics that may suddenly threaten millions.

Using a Cognitive Approach to Interpret and Learn from Data

Gaining a true understanding of treatment options requires bringing together data from multiple ecosystems in a consistent and holistic way. Cognitive systems can capture and integrate unstructured data, such as medical journal articles and images, with structured data from databases. It does little good to analyze these data elements in isolation. The value of a cognitive computing approach is that these elements can be brought together into a corpus that addresses a specific problem. A corpus is the knowledge base of ingested data; it manages codified knowledge from a variety of related sources and is optimized to determine patterns and relationships between data elements and data sources. Applying this approach to healthcare makes it possible to surface insights and correlations that might not otherwise be apparent.
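A production cognitive platform works at far greater scale, but a toy sketch can make the corpus idea concrete. The plain-Python sketch below is illustrative only — the record layouts, field names, and the A1C threshold are invented assumptions, not any vendor’s API. It merges structured lab results and unstructured clinical notes into one per-patient corpus, then looks for note terms that co-occur with an abnormal result:

from collections import Counter, defaultdict

# Illustrative mixed sources: structured lab results and unstructured notes.
structured_records = [
    {"patient_id": 1, "test": "a1c", "value": 8.1},
    {"patient_id": 2, "test": "a1c", "value": 5.4},
]
unstructured_notes = {
    1: "Patient reports fatigue and increased thirst.",
    2: "Routine checkup; no complaints.",
}

# Build a simple corpus: one document list per patient, merging both kinds of data.
corpus = defaultdict(list)
for rec in structured_records:
    corpus[rec["patient_id"]].append(f"{rec['test']}={rec['value']}")
for pid, note in unstructured_notes.items():
    corpus[pid].append(note.lower().strip("."))

# Naive pattern search: which terms co-occur with an elevated A1C result?
elevated = {rec["patient_id"] for rec in structured_records if rec["value"] > 6.5}
co_occurring_terms = Counter()
for pid in elevated:
    for doc in corpus[pid]:
        co_occurring_terms.update(doc.split())

print(co_occurring_terms.most_common(5))

Even this naive word-count version hints at the payoff: once both kinds of data live in one corpus, questions that span them become simple queries rather than cross-system integration projects.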

Applying Cognitive Computing in the Real World

It is often difficult for doctors to diagnose diseases when they lack experience or encounter symptoms they haven’t seen before. The solution to a patient’s problem is not always obvious, even after a battery of tests. Doctors are most efficient and successful when they already have the experience and knowledge to make sense of a complex situation. What happens when an individual doctor lacks that experience? A doctor with only a few years of clinical practice may encounter a patient with unfamiliar symptoms and try to diagnose the issue based on limited experience. Many doctors will take the time to comb through recently published medical journals for mentions of the symptoms; others will contact a specialist who may have seen those same symptoms hundreds if not thousands of times. Either way, this ad hoc approach leaves too much to chance.

Hospitals and healthcare organizations are therefore creating models based on data to capture the experience and expertise of seasoned physicians. For example, what does the data indicate about hospital readmissions, or about which infections were most common in a certain demographic over the last year? What patterns emerge from hundreds of thousands of hospital records? What is the most up-to-date research from medical journals telling the hospital about new treatments and new threats?
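Questions like these reduce to straightforward aggregations once the records are in analyzable form. As a minimal sketch using pandas — the admissions table, its schema, and the values are hypothetical, invented purely for illustration:

import pandas as pd

# Hypothetical admissions extract; the schema is invented for this example.
admissions = pd.DataFrame({
    "patient_id": [1, 1, 2, 3, 3, 3],
    "age_group":  ["65+", "65+", "18-40", "41-64", "41-64", "41-64"],
    "infection":  ["mrsa", None, None, "c. diff", "c. diff", None],
    "readmitted": [True, False, False, True, True, False],
})

# Readmission rate by demographic group.
print(admissions.groupby("age_group")["readmitted"].mean())

# Most common infection within each group over the period.
print(admissions.dropna(subset=["infection"])
                .groupby("age_group")["infection"]
                .agg(lambda s: s.value_counts().idxmax()))

The hard part in practice is not the aggregation but getting hundreds of thousands of siloed records into one consistent, queryable form in the first place.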

A cognitive application built on a corpus that is constantly refreshed and updated with new information can give professionals insight into data that would otherwise be out of reach. Physicians can’t possibly read and analyze all of the new medical research published daily. A cognitive system that can both ingest a huge amount of unstructured data and make sense of the patterns and relationships within it can become an important tool in the diagnosis and management of illness.

What’s Next?

The ability to capture all of the available information in a system that continues to learn and change is at the heart of a cognitive environment. Over the coming decade, cognitive computing applied to healthcare will expand rapidly. Cognitive analytics will provide insights into new treatments and disease diagnoses, and it will offer new ways to transfer knowledge from the most experienced physicians to new doctors in record time. Combining systems trained on data that encodes human experience with the ability to understand context and patterns will revolutionize medicine.

Judith S. Hurwitz is president and CEO of Hurwitz & Associates, a consulting, research, and analysis firm focused on emerging technology including big data and cognitive computing. Judith is a technology strategist, consultant, and thought leader. She is the author of Smart or Lucky? How Technology Leaders Turn Chance into Success (Jossey-Bass, 2011), and the coauthor of Cognitive Computing and Big Data Analytics (Wiley, 2015) and six “For Dummies” books on big data and service management. A pioneer in anticipating technology innovation and adoption, she has served as a trusted adviser to many industry leaders over the years. She is a frequent speaker at conferences and a regular contributor to TDWI publications.

TDWI Onsite Education: Let TDWI Onsite Education partner with you on your analytics journey. TDWI Onsite helps you develop the skills to build the right foundation with the essentials that are fundamental to BI success. We bring the training directly to you—our instructors travel to your location and train your team. Explore the listing of TDWI Onsite courses and start building your foundation today.

 
Announcements
NEW Best Practices Report
Improving Data Preparation for Business Analytics
NEW Checklist Report
Gaining Business Value from Governed Analytics and Discovery
NEW Ten Mistakes to Avoid
In Data Storytelling
NEW Business Intelligence Journal
Business Intelligence Journal, Vol. 21, No. 2
NEW TDWI E-Book
Why Your Next Data Warehouse Should Be in the Cloud
NEW Checklist Report
Assuring the Quality of Operational Data
Contents
Feature
Why Cognitive Computing Is So Important for Healthcare
Flashpoint Insight
Proven Tactics That Will Simplify Your Data Management Strategy
TDWI Research Snapshot
Operationalizing Analytics: Challenges
Flashpoint Rx
Mistake: Failure to Show Your Voice
Education & Events
TDWI Accelerate in Boston
The Westin Copley Place
July 18–20
Seminar in Dallas
Data Mining and Predictive Analytics

Renaissance Dallas Hotel
July 11–14
Seminar in Salt Lake City
Business Analytics

University Guest House & Conference Center
July 25–28
Webinars
Improving Data Preparation for Business Analytics
Thursday, July 14
Streaming Analytics for Real-Time Action: Best Practices for Getting Started
Tuesday, August 2
Faster BI for the Masses: How Search Can Make Analytics More Accessible
Thursday, August 4
Marketplace
TDWI Solutions Gateway
Informatica – Data Management for Next-Generation Analytics
TDWI White Paper Library
Embedded Reporting, Dashboards, and Analytics in On-Prem and SaaS Applications
TDWI White Paper Library
Why Machine Intelligence Is the Key to Solving the Data Integration Problem for the IIoT

Premium Member Discounts

Ready to take the CBIP Exams or attend our next conference? Take advantage of these exclusive member discounts.

$275 discount on TDWI Accelerate

$10 discount on CBIP Exam Guide

Flashpoint Insight
Proven Tactics That Will Simplify Your Data Management Strategy

Asking a business question these days immediately brings up a multitude of other questions that need answers first.

Where does the data to answer the question reside? Does it even exist? Maybe it’s stored in two or three different places. How will it get merged? Should it be merged at all? Is the data complete, or is it missing key values? How will the answer wrap up into a story that the business questioner understands? How quickly can it be done? Ask any data analyst and they will tell you these are just some of the questions that keep them up at night.

If one were to believe the majority of vendors out there these days, the advent of “easy” visualization tools means it takes no time at all to answer an important business question. Simply implementing a visualization tool on top of an existing tangle of systems—CRM/ERP back ends, data marts, data warehouses, Web analytics, flat files, and so on—can indeed be a route to quick answers. However, it does not change the key underlying challenges facing companies big and small: maintaining the integrity of data as it moves across in-house applications and between environments, choosing which of the ever-growing set of analytical tools in the marketplace to invest in, and addressing growing concerns about the complexity and redundancy of data repositories.

I have 20 years of experience working with data across many industries—including financial services, retail, healthcare, marketing/PR, and nonprofits—in settings from large corporate environments to small, agile companies. I have seen the good, the bad, and the ugly of data management, and I can honestly say that the only way a company matures in its use of data as a means to drive business growth is by keeping the fundamentals of its own data management sound.

This article outlines the one component fundamental to a successful data management strategy: an ongoing data audit. The audit, in turn, addresses and mitigates the underlying challenges every company grapples with at one time or another: trust in and accuracy of the data, understanding and communication of what data is available, and compliance and governance covering both existing data and new data collection.
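As a minimal sketch of what a single pass of such an audit might check — the audit function, table, and column names below are illustrative assumptions, not a prescribed tool — a pandas-based profile of one repository could look like this:

import pandas as pd

def audit(df: pd.DataFrame, name: str) -> pd.DataFrame:
    """Profile one table: completeness, duplication, and basic shape."""
    report = pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_pct": df.isna().mean().round(3),
        "distinct": df.nunique(),
    })
    print(f"{name}: {len(df)} rows, "
          f"{df.duplicated().sum()} exact duplicate rows")
    return report

# Example: audit a hypothetical customer extract.
customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, None, "d@x.com"],
})
print(audit(customers, "customers"))

Run regularly against each repository, even a profile this simple surfaces the completeness and duplication issues that quietly erode trust in the data.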

Learn more: Read the entire article by downloading the Business Intelligence Journal (Vol. 21, No. 2).

 
TDWI Research Snapshot
Operationalizing Analytics: Challenges

Although more companies are starting to embed and operationalize analytics, respondents still face a number of challenges, many of which are related to people and processes rather than technology.

As one respondent put it, "IT, political agendas, and agreement on governance tend to be the largest barriers. Little is technical, mostly people related." These challenges are shown in Figure 7.


People Issues

Lack of trust in the data or the results tops the list. Often, people fear what they don’t understand. If the people expected to make embedded analytics part of their business process haven’t bought into it, the implementation is unlikely to succeed. Respondents cited concerns about displacing people and the power shifts staff fear might occur if the organization becomes more analytics-to-action driven. There is also distrust of the data that goes into the analytics; 40% of respondents cited this as a challenge.

Lack of skilled personnel also ranks high. Personnel who can implement and even utilize embedded analytics are in short supply; 38% of respondents cited this challenge.

No executive support, no budget. At TDWI, we repeatedly hear from organizations that the reason it is hard to get an analytics project up and running is lack of executive support. Some executives don’t have the vision and don’t understand the value; some lack knowledge of what analytics is all about. Often, progress comes only when there is a change in leadership and the new executive has a vision to deploy analytics. The budget issue goes hand in hand with executive support. About 28% of respondents cited these challenges.

Overcoming the People Challenges

What are respondents doing to overcome the challenges? Their responses naturally fell into several areas.

Starting small. Often organizations claim that slow and steady wins the race. Respondents spoke about "small projects owned by the business to prove the benefit/capability before pushing for a wider-reaching solution/business case." They spoke about taking "baby steps" to "solve problems consistently and gain trust" and "taking one step back to focus on people and process first."

Education. Skills ranked near the top of the list of challenges. To overcome this, some respondents talked about training from within, because current employees understand the business. Others spoke about hiring externally and training internally. One respondent mentioned hiring a consulting company to do a proof of concept and then having that company help train the organization. Others are looking at tools that are easier to use. Past TDWI research indicates that organizations typically use a combination of approaches to build competency.

Communication. In addition to training staff to understand technologies, many respondents cited the need for educating their organizations about the value of embedding analytics. Often this involves socializing and evangelizing the concepts, especially if an executive is not yet on board. As one respondent explained, “We are using communication, demos, and actively engaging the business.”

Unfortunately, there is no silver bullet when it comes to overcoming people challenges related to embedding analytics. Some organizations are lucky in that everyone seems to be on board from the get-go. Often this happens in smaller or greenfield deployments in new companies that are growing quickly. Sometimes businesses have to wait until a new executive joins the organization to get something moving. However, many organizations spend the time to educate and communicate to get their analytics efforts moving. It is harder, but it can be done.

Read the full report: Download TDWI Best Practices Report: Operationalizing and Embedding Analytics for Action (Q1 2016).

 
Flashpoint Rx
Mistake: Failure to Show Your Voice

Neutral stories are boring stories that have no impact. As a storyteller, it is important to show your voice and to take a position.

Don’t just state the facts—interpret them. Show your beliefs, your opinions, and the reasons why you think the story needs to be told. Don’t avoid conflict—acknowledge it and use it to engage the audience. Don’t be too cautious. Stay away from neutral language and style. Hiding your beliefs and emotions actually diminishes the impact of the story. If you appear to be a disengaged storyteller, you are sure to have a disengaged audience.

Make a considered and conscious choice about your storyteller voice and point of view. Choose first-person narrative when you, as the storyteller, are the focal point of the story and are describing your own experiences. First-person storytelling speaks of "I" and creates a very strong connection when it elicits visceral audience responses such as awe, fear, empathy, or respect for the storyteller.

Choose second-person narrative when the focal point is the audience and the storyteller perspective is that of message bearer. Second-person storytelling speaks of "you" and connects strongly when a compelling message draws the audience into the story.

Choose third-person narrative to focus on characters other than the audience or the storyteller. Third-person storytelling speaks of "he" and "she" characters by name. Bringing the audience into the story is a two-stage process. The storyteller shows emotional reaction—empathy, disdain, etc.—for characters in the story in a way that guides the audience to find their own emotional responses.

Read the full issue: Download Ten Mistakes to Avoid in Data Storytelling (Q2 2016).