The Three Most Important Emerging AI Trends in Data Analytics

AI is getting a lot of buzz now, but where is it headed? Here are three trends to keep watching.

It has been a crazy year for data analytics, and three important trends have emerged that will forever change this practice. Until this year, we had concepts such as “big data” that promoted collecting and storing everything but offered little guidance on what to actually do with the massive unstructured data being gathered.

Then came data analytics, which was all about analyzing this data, but it didn’t address the big problem: you needed data scientists (who were in short supply) with the rare ability to talk to business managers (a skill that was almost nonexistent). Then came digital transformation, a much broader term that focused on what you needed to do but was light on why you needed to do it. Now we are all about working with AI, which could finally make everything that came before it work.

Let’s look at the three most important trends in data analytics.

Trend #1: Increased emphasis on conversational AI and large language models

This trend directly addresses the need to bypass data scientists and create a better interface so business managers can ask the system directly for what they need. When properly trained and implemented, these tools can deliver reports and detailed answers in minutes, something that would have typically taken weeks or even months.

Although future systems will be better able to learn from and adapt to individual users, current systems are mostly inference engines that treat the user generically and require a higher level of up-front training. Even so, the result is still generally faster than most, though not all, implementations that require a data scientist.
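
To make this concrete, here is a minimal sketch of the pattern in Python, assuming an OpenAI-style chat API, a small SQLite sales table, and an illustrative schema (none of which are specific to any vendor discussed here). The model turns a plain-English business question into a read-only SQL query that is run against existing analytics data.

import sqlite3

from openai import OpenAI  # assumes the openai package is installed

# Illustrative schema; a real deployment would pass the warehouse's actual schema.
SCHEMA = "CREATE TABLE sales (region TEXT, quarter TEXT, revenue REAL);"

def question_to_sql(question: str) -> str:
    # Ask the model for a single read-only SQL query answering the question.
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    prompt = (
        "You write SQLite queries. Schema:\n" + SCHEMA + "\n"
        "Return only one SELECT statement answering the question.\n"
        "Question: " + question
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any capable chat model works
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()

def answer(question: str, db_path: str = "analytics.db") -> list:
    # Run the generated SQL and return the rows for the business user.
    sql = question_to_sql(question)
    if not sql.lower().lstrip().startswith("select"):
        raise ValueError("Refusing to run non-SELECT SQL: " + sql)
    with sqlite3.connect(db_path) as conn:
        return conn.execute(sql).fetchall()

# Example: answer("Which region had the highest revenue last quarter?")

In practice the generated SQL would also be checked against the warehouse’s access controls before execution; the point is simply that the business manager never has to queue behind a data science team for a routine question.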

Trend #2: Enterprises need the assurance of indemnified data sources

As AI-enabled analytics applications are spun up, it is increasingly critical that the training and production data sets are unbiased and incorruptible. Data sets that are biased or simply out of date can lead the system to make bad recommendations and worse decisions. Ensuring the safety of the data includes a legal process (asking the supplying firm to guarantee that the data in the repository isn’t owned by someone else who might take exception to its use) and some form of indemnification.

The use of indemnification isn’t consistent, however: some of the more mature firms indemnify their customers, while others ask for indemnification from their customers. The latter has proven somewhat problematic because not everyone agreeing to such indemnification gets the proper sign-off from their legal department or outside counsel.
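
Part of that assurance, catching biased or out-of-date data before it reaches the model, can also be automated. The sketch below is written against a pandas DataFrame with illustrative column names and thresholds; it flags data sets whose newest records are stale or where one customer segment dominates the sample, and it complements rather than replaces the legal review described above.

from datetime import datetime, timedelta

import pandas as pd

MAX_AGE_DAYS = 90        # illustrative: data older than this is considered stale
MAX_GROUP_SHARE = 0.60   # illustrative: no single segment should dominate the sample

def validate_dataset(df: pd.DataFrame) -> list:
    # Return a list of human-readable data-quality warnings.
    # "event_date" and "customer_segment" are assumed column names for illustration.
    warnings = []

    # Freshness check: flag data sets whose newest record is out of date.
    newest = pd.to_datetime(df["event_date"]).max()
    if datetime.now() - newest.to_pydatetime() > timedelta(days=MAX_AGE_DAYS):
        warnings.append("Data is stale: newest record is from %s." % newest.date())

    # Representation check: flag one segment crowding out all the others.
    shares = df["customer_segment"].value_counts(normalize=True)
    if shares.iloc[0] > MAX_GROUP_SHARE:
        warnings.append(
            "Segment '%s' makes up %.0f%% of rows." % (shares.index[0], shares.iloc[0] * 100)
        )
    return warnings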

Trend #3: The popularity of hybrid AI grows

AI is very expensive to run in the cloud because it consumes substantial processing and storage resources. Shifting the load to the client frees up those resources and allows for faster results, with some loss of trainability and customization because clients typically run a compressed model with more limited inferencing than a full cloud implementation. However, most cloud implementations also use limited inferencing models to reduce operational costs, so the lack of flexibility (at least with current client technologies and implementations) is more a theoretical problem than an actual one.
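
As a rough illustration of how that load shifting can look in application code, here is a minimal routing sketch. The local and cloud model calls are hypothetical stand-ins for a vendor’s on-device (NPU) runtime and hosted API, and the confidence threshold is arbitrary: requests are answered by the compressed local model whenever it is confident enough and escalated to the cloud only otherwise.

CONFIDENCE_THRESHOLD = 0.8  # illustrative cutoff for trusting the local answer

def run_local_model(prompt: str):
    # Hypothetical stand-in for an on-device, quantized model running on an NPU.
    # A real implementation would call the vendor's local inference runtime here.
    return "local answer to: " + prompt, 0.5

def run_cloud_model(prompt: str) -> str:
    # Hypothetical stand-in for a call to a full-size hosted model.
    # A real implementation would call the cloud provider's API here.
    return "cloud answer to: " + prompt

def hybrid_answer(prompt: str) -> str:
    # Prefer the cheap local model; escalate to the cloud only when needed.
    answer, confidence = run_local_model(prompt)
    if confidence >= CONFIDENCE_THRESHOLD:
        return answer               # good enough; no cloud round-trip or cost
    return run_cloud_model(prompt)  # heavier and slower, but more capable

The design choice is the same one driving the trend: keep routine inferencing on the client, where the compute is effectively free, and reserve cloud capacity for the requests that genuinely need it.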

This is driving a massive rush by processor and platform companies, and even AI companies such as OpenAI (the maker of ChatGPT), to develop focused neural processing units (NPUs) that can be put in desktop systems to better optimize them for this new hybrid reality. This hybrid trend started in 2023, but it won’t truly reach its potential until after 2024, when new and vastly more powerful NPUs are expected to come to market.

Final Thoughts

As they reach maturity (likely later in this decade), all three of these trends will eventually provide the kind of benefits (in terms of actionable information and recommendations) that data analytics always promised but rarely delivered. The problem that still needs to be overcome is getting people to trust AI and not be afraid to use it. That will take additional time, particularly because much of what these systems currently produce rests on corrupt data sets, poorly trained models, or incomplete implementations.

I expect most of these initial teething problems to go away by the end of the decade. Until then, given the rapid advancement of this technology, making sure you have a vendor or consultant who is an expert on the technology and on what you need from it should make a huge difference in the success of your AI-driven data analytics solution.

About the Author

Rob Enderle is the president and principal analyst at the Enderle Group, where he provides regional and global companies with guidance on how to create a credible dialogue with the market, target customer needs, create new business opportunities, anticipate technology changes, select vendors and products, and practice zero-dollar marketing. You can reach the author via email.

