Data Insights and the Role of AI with Philip Zelitchenko

Organizations can overcome barriers to timely insights. ZoomInfo’s Philip Zelitchenko explains how AI can help.

As part of TDWI’s three-part podcast series reviewing TDWI’s latest Best Practices Report (BPR) about reducing time to insight and maximizing the benefits of real-time data, Philip Zelitchenko, vice president of data and analytics for ZoomInfo, offers his perspective on barriers to insights, strategies for improving insights (and the role of data quality), and the use of AI (and its future). [Editor’s note: Speaker quotations have been edited for length and clarity.]

The conversation started with a discussion of the barriers organizations experience trying to get better insights faster. Zelitchenko identified several barriers, starting with silos.

“When your data exists in multiple locations across the organization, you have data that lives on the enterprise side. You have systems such as your CRM app and you have operational systems that live on the product side. When a visitor goes to your website, all those engagements are recorded in different areas, both on the enterprise side and on the product side.”

Zelitchenko suggests your enterprise must first determine how quickly it needs those insights. The answer, he says, depends in part on whether you’re a B2B or a B2C company. “I call this out because the freshness of the data and real-time or near-real-time applications are different for different use cases across these two types of companies. Once the data is being brought together from those different systems into a centralized place, the question then becomes: how often do I need this data fresh and for what purpose? In the B2C world, time-to-insight is critical. You have people coming on your website, they go on Amazon, and they look at different products. You're trying to capture the buyer at this moment. In the B2B world, it varies. The sellers in a B2B world can consume the insights and signals that are captured in a higher latency environment.”

What is the best strategy for dealing with data quality and data trust problems to get the best possible insights? Zelitchenko offered an example to showcase some of the different challenges caused by poor data quality -- many B2B companies don’t understand who their customer is.

“Suppose our company has 16 Google accounts in our CRM system. These are 16 different sub-teams at Google that use our product. We need to be able to capture all these instances and bring them into a single group. The ability to see that -- to understand that within our total addressable market there are multiple buying groups under a single company called Google -- is important. Being able to distinguish which ones are relevant and which ones match our ideal customer profiles is critical for our ability to go to market in a very efficient way. That's one big piece of the puzzle that needs to be solved.”
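
Grouping many sibling accounts under one parent company is, at its core, an entity resolution problem. As a minimal illustration -- a hedged sketch, not ZoomInfo's actual method -- the hypothetical Python snippet below clusters CRM account records by their normalized root domain:

```python
# A minimal sketch (not ZoomInfo's pipeline) of grouping CRM account records
# under a single parent company by normalizing their website domains.
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical CRM export: two sub-teams that both belong to Google.
accounts = [
    {"account_id": "A-001", "name": "Google Cloud EMEA", "website": "https://cloud.google.com"},
    {"account_id": "A-002", "name": "Google Ads US", "website": "https://ads.google.com"},
    {"account_id": "A-003", "name": "Acme Corp", "website": "https://www.acme.example"},
]

def root_domain(url: str) -> str:
    """Reduce a website URL to a root domain using a naive last-two-labels rule."""
    host = urlparse(url).netloc.lower()
    return ".".join(host.split(".")[-2:])

# Cluster accounts that share a root domain into one buying-group candidate.
clusters = defaultdict(list)
for account in accounts:
    clusters[root_domain(account["website"])].append(account["account_id"])

for domain, ids in clusters.items():
    print(f"{domain}: {len(ids)} account(s) -> {ids}")
# google.com: 2 account(s) -> ['A-001', 'A-002']
# acme.example: 1 account(s) -> ['A-003']
```

The last-two-labels rule here is deliberately naive (it would misgroup country-code domains such as .co.uk); production-grade entity resolution typically matches on many signals -- names, addresses, domains -- against a curated company graph.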

How do you get there? “At ZoomInfo, the tools we use vary between groups. In some cases, real-time usage and data quality are important, so we implement our own solution. There are other cases where data is not real time, and we use other tools to solve those use cases, monitor that data, and make sure that it doesn't change often. Different products, different use cases -- but we try to monitor data quality across the entire funnel.”
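
ZoomInfo's internal tooling isn't public, but as a rough sketch of what funnel-wide monitoring can look like, the hypothetical Python snippet below computes three common data quality metrics -- completeness, uniqueness, and freshness -- over a small CRM extract and flags any metric that crosses a threshold:

```python
# A minimal sketch of rule-based data quality monitoring; the columns,
# thresholds, and data are hypothetical, not ZoomInfo's actual checks.
import pandas as pd

records = pd.DataFrame({
    "account_id": ["A-001", "A-002", "A-003", "A-003"],
    "email": ["a@x.com", None, "c@z.com", "c@z.com"],
    "updated_at": pd.to_datetime(["2024-05-01", "2024-05-02", "2023-01-15", "2023-01-15"]),
})

def run_quality_checks(df: pd.DataFrame, now: pd.Timestamp) -> dict:
    """Return completeness, uniqueness, and freshness rates for a CRM extract."""
    return {
        "null_email_rate": df["email"].isna().mean(),
        "duplicate_id_rate": df["account_id"].duplicated().mean(),
        "stale_record_rate": (now - df["updated_at"] > pd.Timedelta(days=180)).mean(),
    }

metrics = run_quality_checks(records, pd.Timestamp("2024-06-01"))
for name, value in metrics.items():
    status = "ALERT" if value > 0.10 else "ok"  # hypothetical 10% tolerance
    print(f"{name}: {value:.0%} [{status}]")
```

In practice, checks like these would run on a schedule against each stage of the funnel, with per-dataset thresholds tuned to how fresh each use case actually needs the data to be.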

Among the current tools of interest in a variety of areas -- including data analytics and insights -- is artificial intelligence. What impact does AI -- including generative AI -- have? Can it help drive faster insights?

The latest innovations involve LLMs, according to Zelitchenko, who offers as an example ZoomInfo's own Copilot product, which the company will release shortly. “LLMs and generative AI will have an impact on the efficiency and productivity of the individual contributor. If you look at our Copilot, for example, it gives account managers and others the ability to provide a higher quality of coverage because some parts of their work are automated. AI lets them cover more accounts and cover them at a higher quality.

“Another example is Microsoft Copilot, which helps software engineers be more efficient in the code they write. We saw about a 26-27% improvement -- our CTO just posted about it on social media -- while other companies have talked about a 40% improvement. Does that mean we need fewer people writing code? No, because what happens is that the tide raises all boats -- now every company has that efficiency boost.”

Improved efficiency across all employees means “now we're able to generate more code, which means releasing more features and building more products at a higher quality in the long term, because we can expedite some of the work we do. That’s cool!”

Implementing copilots isn’t without its own challenges. The LLMs that copilots use are only as good as the data you feed them, so the quality of that data is key. “Most LLMs suffer from hallucinations -- the rate varies from really good models (at about 4-5% hallucinations) to not-as-good models (at about 12-14%). In those cases, you need more data governance to ensure that what comes out is valid and can be utilized for production.” We may be years away from getting these LLMs to the point where hallucinations are down to 1%. Four or five percent seems like a good number, but if you're operating at larger scale, the errors can add up fast.
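
To make the scale point concrete, here is a back-of-the-envelope calculation; the daily volume is a hypothetical assumption, and the rates stand in for the “good” and “not-as-good” models Zelitchenko cites:

```python
# Back-of-the-envelope: a "small" hallucination rate still yields a large
# absolute number of bad outputs at scale. The volume below is hypothetical.
daily_llm_outputs = 100_000  # assumed generated answers per day

for label, rate in [("good model (4%)", 0.04), ("weaker model (13%)", 0.13)]:
    bad_per_day = daily_llm_outputs * rate
    print(f"{label}: ~{bad_per_day:,.0f} hallucinated outputs/day, "
          f"~{bad_per_day * 365:,.0f}/year")
# good model (4%): ~4,000 hallucinated outputs/day, ~1,460,000/year
# weaker model (13%): ~13,000 hallucinated outputs/day, ~4,745,000/year
```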

Down the road, does Zelitchenko expect that every company will have its own large language model operating within its own organization? “The industry is moving pretty fast. Improvements are happening at large scale every week, and the cost of serving and training these LLMs is going down significantly. In a year or two, the ability for a business of any size to operate its own models is going to be a given; it will become a commodity. You can also see this in the open source approach taken by many companies, which will enable anyone to take a model and train it.”

To get started with this technology, Zelitchenko recommends the agile method. “Start with something small. Try it out and make sure it works. Test it on your customers with small groups. Make sure employees play around with it.”

[Editor’s note: You can listen to the podcast on demand here.]

