How to Deploy Generative AI Effectively in 2024

Enterprises that educate themselves about the benefits and limitations of generative AI and LLMs -- and conduct effective POCs -- will reap the benefits of this technology.

The dramatic introduction of generative AI and large language models (LLMs) to the general public presents business leaders with an inflection point they must embrace now -- but with care. On the one hand, everyone today has access to the most groundbreaking technological advancement since the internet. On the other hand, the technology comes with fundamental challenges that must be well understood and addressed.

Although it is tempting to adopt a “wait-and-see” approach, waiting too long to address generative AI can cause real business problems and leave an organization unable to keep up with competitors. Generative AI will serve as the pivotal tool enabling businesses to enrich, empower, and engage both employees and customers -- with or without the active involvement of the “wait-and-see” crowd.

Enterprises looking to leverage generative AI must adopt an objective methodology to cut through the noise. To convince the organization that it is feasible to move forward now, leaders must conduct a proof of concept (POC) or trial using the organization’s data that can be executed quickly, function correctly (without hallucinations), and showcase the crucial analytics and technologies that will enable successful operations now and in the future. It must also address concerns about security and scalability to avoid managing disjointed tools and provide evidence of an initial return on investment to prove the project is worthwhile.  

In 2024, organizations will need to rethink how they deploy generative AI practically and make the technology effective for business users by addressing concerns about reliability, resource consumption, and cost-effectiveness. As a result, expect to see the following trends in the coming year.

Trend #1: Generative AI and large language model hype will start to fade 

Without a doubt, generative AI is a major leap forward. However, many people have wildly overestimated what is actually possible. Generated text, images, and voices can seem incredibly authentic, as if they were created with the same thoughtfulness and desire for accuracy a human would bring. In reality, they are just statistically likely combinations of words or images that fit together well -- and may be completely inaccurate. The good news is that the outputs of generative AI can be incredibly useful if the end user fully considers their benefits and limitations.

As a result, 2024 will usher in reality checks for organizations on the real limitations and benefits generative AI and LLMs can bring to their business. The outcomes of those assessments will reset the strategies and adoption of those technologies. 

Vendors will need to make these benefits and limitations apparent to end users who are appropriately skeptical of anything created by AI. Key elements such as accuracy, explainability, security, and total cost must be considered. 

In 2024, the generative AI space will settle into a new paradigm for enterprises, one in which they deploy just a handful of generative AI–powered applications in production to solve specific use cases.

Trend #2: Natural language interfaces will become ubiquitous

Imagine this scenario: you walk into a brick-and-mortar retail store and ask the store assistant a question. Instead of a verbal response, you are pointed to a display with a list of options or to a whiteboard illustration with minimal text. In this silent exchange, the richness of human-level communication is replaced by a menu of options or a group of visuals.

Odd, right? Yet, this has been the paradigm for most websites for the past 25 years.

There is already a race to create “intimacy at scale on the web” enabled by generative AI and LLMs. This level of personalization is difficult to attain, and the challenges involved are well understood. A small number of vendors have worked out how to overcome those challenges in a production environment, enabling accurate and trusted interactions with these language models.

As a result, and as these positive experiences multiply in 2024, more people will become comfortable using natural language interfaces and will rely on them more extensively.

Trend #3: Businesses will learn that adding generative AI to existing tools will not address foundational weaknesses 

Although generative AI can provide valuable assistance, it cannot miraculously solve foundational issues related to the volume of information and the relevance of searches through that data. If an existing tool was unable to reliably and immediately surface relevant information ten months ago, bolting generative AI onto it will not make it work better. Similarly, if a solution did not effectively answer questions before, merely adding generative AI will not change its performance.

Put simply, when it comes to generative AI, garbage in produces garbage out.

In 2024, a few implementations of retrieval-augmented generation (RAG) will emerge as the only viable way to eliminate hallucinations. RAG is an AI framework that supplies a generative model with a narrow, relevant set of inputs so it can produce an accurate and reliable summary. However, executing this framework well is no easy task, and consequently not all instances of RAG are created equal. For instance, if RAG yields pages of results that may or may not be accurate and defers the task of deciphering the correct answer to the generative AI, the outcome will once again be subpar and unsuitable for business use.
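To make the pattern concrete, the following minimal Python sketch illustrates the RAG idea. The function names are illustrative, generate() is a placeholder for whatever LLM API an organization uses, and the crude word-overlap scorer stands in for the vector search a production retriever would employ; the point is simply that only a narrow, relevant slice of the corpus ever reaches the model.

def generate(prompt: str) -> str:
    # Placeholder for a call to whatever LLM the organization has chosen.
    raise NotImplementedError

def relevance(question: str, passage: str) -> float:
    # Crude relevance proxy (word overlap); production systems use vector embeddings.
    q, p = set(question.lower().split()), set(passage.lower().split())
    return len(q & p) / len(q | p) if q | p else 0.0

def retrieve(question: str, passages: list[str], k: int = 5) -> list[str]:
    # Keep only the k most relevant passages rather than the whole corpus.
    return sorted(passages, key=lambda p: relevance(question, p), reverse=True)[:k]

def answer(question: str, passages: list[str]) -> str:
    # Ground the model in a narrow, relevant context and instruct it not to guess.
    context = "\n".join(retrieve(question, passages))
    prompt = ("Answer the question using ONLY the context below. "
              "If the context does not contain the answer, say so.\n\n"
              f"Context:\n{context}\n\nQuestion: {question}")
    return generate(prompt)

A poorly executed RAG implementation effectively skips or weakens that retrieval step, handing the model pages of loosely related text and hoping it sorts out the right answer on its own.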

Generative AI faces the same challenge a human would in trying to summarize ten pages of mixed relevant and irrelevant data; both do a far better job synthesizing ten relevant sentences. Furthermore, RAG alone can still fail to surface highly accurate answers to questions that require domain-specific context. Boosting the relevance of results requires last-mile fine-tuning of the LLM. This combined RAG-plus-fine-tuning approach is what will bring generative AI solutions to production-level performance for companies next year.
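That second, sentence-level filtering pass might look like the sketch below. The overlap scorer is again a deliberate oversimplification and the names are hypothetical; in practice, the last-mile fine-tuning described above (a domain-tuned reranker or model) would do the real work of deciding which sentences matter.

import re

def overlap(question: str, text: str) -> float:
    # Simplified relevance proxy; a domain-tuned reranker would replace this.
    q, t = set(question.lower().split()), set(text.lower().split())
    return len(q & t) / len(q | t) if q | t else 0.0

def top_sentences(question: str, pages: list[str], n: int = 10) -> list[str]:
    # Split the retrieved pages into sentences and keep only the n most relevant,
    # so the model synthesizes ten on-point sentences instead of ten mixed pages.
    sentences = [s.strip()
                 for page in pages
                 for s in re.split(r"(?<=[.!?])\s+", page)
                 if s.strip()]
    return sorted(sentences, key=lambda s: overlap(question, s), reverse=True)[:n]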

Reflection

In the coming year, enterprises that educate themselves about the benefits and limitations of generative AI and LLMs and conduct effective POCs will reap the benefits, while the “wait-and-see” contingent will stand out starkly with less engaging interfaces for their customers.

Organizations that effectively assess accuracy, explainability, security, and cost will offer their end users a personalized experience based on ubiquitous natural language interfaces. RAG will emerge as a way to eliminate hallucinations, but very few implementations will prove their worth.

About the Author

Ryan Welsh is the CEO and founder of Kyndi, a global provider of the Kyndi Generative AI Answer Engine, an AI-powered platform that finds accurate and direct answers to questions in one click. Before founding Kyndi, Ryan was a senior associate at NextFED in Arlington, VA, a leading deep tech commercialization and M&A firm for the federal market. He worked with Los Alamos National Laboratory to launch startups based on technology developed at the lab. At NextFED, Ryan led the commercialization of technologies including quantum cryptography, cybersecurity, small satellites, and artificial intelligence. For more information, visit kyndi.com or follow on LinkedIn or X/Twitter.

