

Executive Perspective: Directions in Analytics

From digital twins and contextual AI to graph databases and analytics democratization, many analytics changes are ahead. Todd Blaschka, chief operating officer at TigerGraph, sorts out which trends are worth watching.

Upside: What technology or methodology must be part of an enterprise’s data or analytics strategy if it wants to be competitive today? Why?


Todd Blaschka: The pandemic is changing data and analytics. Models based on historical data are no longer valid. Enterprises will need a greater variety of analytics techniques to understand the relationships in their data, incorporate real-time data, and add context to AI/ML models so they can reconfigure their systems and thrive after this reset.

One strategy is to create a “digital twin,” or replica, of the business application. A twin helps an enterprise find new relationships in its data by simulating and modeling new scenarios with historical and real-time data processed through machine learning and AI. This approach helps identify historical blind spots, optimize processes, and uncover new tactical opportunities.
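As a rough illustration of the idea, the sketch below (plain Python, with made-up demand figures, capacity numbers, and function names) replays historical demand through a simple model of a fulfillment process and then reruns it under a hypothetical disruption, the kind of what-if comparison a digital twin is meant to answer.

```python
# Minimal digital twin sketch: replay historical demand through a simple model
# of a fulfillment process, then rerun it under a hypothetical capacity cut.
# All figures and names (historical_demand, run_scenario) are illustrative.

historical_demand = [120, 135, 128, 140, 150, 145, 160]  # units per day (example data)

def run_scenario(demand, daily_capacity, backlog=0):
    """Simulate how many units the modeled process ships each day."""
    shipped_per_day = []
    for units in demand:
        total = units + backlog
        shipped = min(total, daily_capacity)
        backlog = total - shipped
        shipped_per_day.append(shipped)
    return shipped_per_day, backlog

baseline, _ = run_scenario(historical_demand, daily_capacity=150)          # twin calibrated to today
disrupted, leftover = run_scenario(historical_demand, daily_capacity=110)  # what-if scenario
print("baseline shipped:", baseline)
print("disrupted shipped:", disrupted, "| unmet backlog:", leftover)
```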

Humans work with the digital twin to make decisions and take action based on the data and information it collects. Doing so requires fostering a collaborative environment with colleagues in IT, other enterprise functions, and business partners to monitor progress toward shared goals.

What one emerging technology are you most excited about and think has the greatest potential? What’s so special about this technology?

AI is powering so many applications and services, yet many times a day we wonder “Why was this recommended to me?” or “Why did the assistant do this?” These systems are powered by statistical learning that does not incorporate human context or human interaction into the applications; supplying that missing piece is the job of contextual AI.

Contextual AI takes a human-centric approach to AI. This means it has sufficient perception of the user’s environment, situation, and context to reason properly. To do so, it finds relationships within data and across data types; it enhances discovery of unexpected dependencies from one stream of activity to many other streams and back again, forming a foundation for machine learning and AI applications.

Context comes from finding and adding relationships within data and across data types, people, places, things, and objects. Drawing on these patterns delivers a more relevant outcome and more value from the analytics.

For example, a customer journey includes the customer’s personal information, interaction history, and every other touchpoint, all of which should be available to the agent during a service or sales interaction. When your agents know the context of the customer -- what they’re looking for and what they’ve already spoken to another agent about -- they can give better service, faster.
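As a hedged sketch of what that context can look like in practice, the example below uses the open source networkx library to assemble a tiny customer-360-style graph and gather everything within two hops of a customer for an agent. The node names, relationship labels, and the context_for helper are invented for illustration; this is not any vendor's API.

```python
# Sketch of a "customer 360" context graph: link a customer to interactions,
# products, and prior support cases, then pull the context an agent would need.
import networkx as nx

g = nx.MultiDiGraph()
g.add_edge("customer:ada", "case:1001", relation="opened")
g.add_edge("case:1001", "agent:sam", relation="handled_by")
g.add_edge("customer:ada", "product:router-x", relation="owns")
g.add_edge("case:1001", "product:router-x", relation="about")

def context_for(graph, customer):
    """Collect every relationship within two hops of the customer as agent context."""
    nearby = nx.single_source_shortest_path_length(graph, customer, cutoff=2)
    return [(u, v, d["relation"]) for u, v, d in graph.edges(data=True)
            if u in nearby and v in nearby]

for edge in context_for(g, "customer:ada"):
    print(edge)  # e.g. ('customer:ada', 'case:1001', 'opened')
```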

What is the single biggest challenge enterprises face today? How do most enterprises respond (and is it working)?

COVID-19 has created a time of uncertainty with key concerns including declining sales growth, changes in customer needs, and disruptions to new projects. These increasingly complex questions highlight the need for more accurate, contextually aware analytics to plan, optimize, prioritize, and focus on business investments.

To respond to these changing conditions, companies will leverage different types of analytics and AI processing that bring decision making closer to real time and scale across the enterprise.

Use-case-focused solutions help enterprises address these challenges and plan for the rebound from the crisis. For example, supply chains that were paused and set up for previous forecasts may have to pivot to meet new requirements. Enterprises need to understand their supply chain now (which parts, where they are, and in what actual quantity) and model new scenarios based on the new demand to understand the impact of a part shortage on a customer order or how much production should shift from product A to product B.
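A minimal sketch of that kind of impact analysis, again using networkx with invented part, product, and order identifiers (not TigerGraph's actual query language), might model the supply chain as a dependency graph and trace which customer orders a part shortage touches:

```python
# Illustrative supply chain dependency graph: parts feed products, products
# fulfill orders. A shortage of one part is traced to every downstream order.
import networkx as nx

supply = nx.DiGraph()
supply.add_edge("part:chip-7", "product:A")   # part feeds product
supply.add_edge("part:chip-7", "product:B")
supply.add_edge("product:A", "order:5001")    # product fulfills order
supply.add_edge("product:B", "order:5002")
supply.add_edge("product:B", "order:5003")

def orders_at_risk(graph, part):
    """Return every order downstream of the short part."""
    return sorted(n for n in nx.descendants(graph, part) if n.startswith("order:"))

print(orders_at_risk(supply, "part:chip-7"))  # ['order:5001', 'order:5002', 'order:5003']
```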

Is there a new technology in data or analytics that is creating more challenges than most people realize? How should enterprises adjust their approach to it?

As AI becomes an increasing part of our daily lives, from machine learning-powered predictive analytics and conversational applications to autonomous machines and hyper-personalized systems, we are finding that the need to trust these AI-based systems with all manner of decision making and prediction is paramount.

However, most people don’t know how AI systems make the decisions they do. Many of the algorithms used for machine learning cannot be examined after the fact to understand specifically how and why a decision was made (e.g., why was this person denied a loan?). We must be able to fully understand how AI decisions are made in order to trust them.

To solve this problem, companies are developing so-called explainable AI programs that describe applications’ rationale, characterize their strengths and weaknesses, and convey an understanding of how they work. These programs will build more trust and aid in the broader adoption of AI.
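Approaches vary widely; one simple, hedged illustration of the idea is to use an inherently interpretable model whose rules can be printed and inspected after the fact. The sketch below uses scikit-learn with synthetic loan-style features (credit_score and debt_to_income are made up); it stands in for, rather than reproduces, any production explainable AI program.

```python
# Minimal "explainable" scoring sketch: a small interpretable tree whose rules
# can be printed, so "why was this applicant denied?" has a traceable answer.
# Features and training data are synthetic and for illustration only.
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[620, 0.45], [710, 0.20], [580, 0.50], [690, 0.35], [740, 0.10], [600, 0.40]]
y = [0, 1, 0, 1, 1, 0]  # 1 = approved, 0 = denied
feature_names = ["credit_score", "debt_to_income"]

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The learned rules are human-readable; a denial can be explained by following
# the printed decision path for the applicant's feature values.
print(export_text(model, feature_names=feature_names))
print("decision for applicant [605, 0.42]:", model.predict([[605, 0.42]])[0])
```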

What initiative is your organization spending the most time/resources on today?

At TigerGraph, our entire team is focused on creating an easier way for companies to ask business logic questions of their data and receive real answers. This means the relationships in the data carry the analytics value, which requires a platform that can store and analyze those relationships and interoperate with current systems. Gartner predicts that by 2023, graph technologies will facilitate rapid contextualization for decision making in 30 percent of organizations worldwide.

To get there requires democratizing graph analytics. The power of graph databases and analytics has until now been limited to technical users. Our mission is to make graphs accessible to everyone by enabling nontechnical users to accomplish as much with graphs as the experts do.

For our customers and partners, this means nontechnical users can produce and run graph queries simply by drawing the patterns they want, similar to visual data modeling. No coding experience is needed.
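One way to picture how a drawn pattern becomes a query, sketched here with networkx rather than TigerGraph's actual visual tooling, is subgraph pattern matching: describe the shape you want (a customer connected to a merchant through a transaction) and let a matcher find every instance in the data graph. All node labels below are invented.

```python
# Pattern-style query sketch: the "drawn" pattern is a small graph, and a
# subgraph matcher finds every occurrence of that shape in the data graph.
import networkx as nx
from networkx.algorithms import isomorphism

data = nx.DiGraph()
data.add_node("c1", kind="customer"); data.add_node("c2", kind="customer")
data.add_node("t1", kind="transaction"); data.add_node("t2", kind="transaction")
data.add_node("m1", kind="merchant")
data.add_edges_from([("c1", "t1"), ("t1", "m1"), ("c2", "t2"), ("t2", "m1")])

# The pattern: any customer linked to any merchant through a transaction.
pattern = nx.DiGraph()
pattern.add_node("cust", kind="customer")
pattern.add_node("txn", kind="transaction")
pattern.add_node("store", kind="merchant")
pattern.add_edges_from([("cust", "txn"), ("txn", "store")])

matcher = isomorphism.DiGraphMatcher(
    data, pattern, node_match=lambda a, b: a["kind"] == b["kind"])
for match in matcher.subgraph_isomorphisms_iter():
    print(match)  # maps data nodes to pattern roles, e.g. {'c1': 'cust', ...}
```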

Where do you see analytics and data management headed in 2020 and beyond? What’s just over the horizon that we haven’t heard much about yet?

Sophisticated, complex analytics on a greater variety of data is becoming more strategic. Analyzing the data will not require you to be a coder. Analytics will be democratized so that all parts of an organization can access analytics systems and gain actionable insights that help the business thrive.

Describe your product/solution and the problem it solves for enterprises.

TigerGraph connects data silos for deeper, wider, operational analytics at scale. It powers applications such as fraud detection, customer 360, MDM, IoT, AI, and machine learning. Four out of the top five global banks use TigerGraph for real-time fraud detection. Over 50 million patients receive care path recommendations to assist them on their wellness journey; 300 million consumers receive personalized offers with recommendation engines powered by TigerGraph. The energy infrastructure for 1 billion people is optimized by TigerGraph for reducing power outages. TigerGraph Cloud brings the power of advanced analytics to every business user and data scientist.

About the Author

James E. Powell is the editorial director of TDWI, responsible for research reports, the Business Intelligence Journal, and the Upside newsletter. You can contact him via email.

