

Kong Open Sources Its AI Gateway for Democratizing Multi-LLM Use

Company unveils free open source “no-code” plugin suite supporting multiple LLMs and offering advanced prompt engineering and AI analytics for fast, secure, high-performance AI adoption.

Note: TDWI's editors carefully choose vendor-issued press releases about new or upgraded products and services. We have edited and/or condensed this release to highlight key features but make no claims as to the accuracy of the vendor's statements.

Kong Inc., a developer of cloud API technologies, has released a suite of open source AI plugins for Kong Gateway 3.6 that can turn any Kong Gateway deployment into an AI Gateway, offering support for multiple large language model (LLM) integrations.

By upgrading their current Kong Gateway deployments, users gain access to a suite of six new plugins focused entirely on AI and LLM use. Developers who want to integrate one or more LLMs into their products can be more productive and ship AI capabilities faster, while architects and platform teams get a secure solution that ensures visibility, control, and compliance on every AI request their teams send. Thanks to tight integration with Kong Gateway, AI flows can be orchestrated in the cloud or on self-hosted LLMs with the strong performance and low latency that AI-based applications require.
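As a rough sketch of how one of these plugins is attached to a gateway route, the snippet below builds the kind of JSON body that might be POSTed to Kong's Admin API to enable the "AI proxy" plugin. The field names follow Kong's documented `ai-proxy` schema (`route_type`, `auth`, `model`), but the specific values, model names, and the placeholder key are illustrative assumptions, not details taken from this release.

```python
import json

# Hypothetical sketch: the plugin object one might POST to
# POST http://localhost:8001/routes/{route}/plugins
# to enable ai-proxy on a route. Values are placeholders.
ai_proxy_config = {
    "name": "ai-proxy",
    "config": {
        "route_type": "llm/v1/chat",
        "auth": {
            # The provider key lives in the gateway config,
            # not in the application -- enabling central rotation.
            "header_name": "Authorization",
            "header_value": "Bearer <PROVIDER_API_KEY>",
        },
        "model": {"provider": "openai", "name": "gpt-4"},
    },
}

# Switching LLM providers is a gateway-config change only;
# application code calling the route stays untouched.
ai_proxy_config["config"]["model"] = {"provider": "anthropic",
                                      "name": "claude-3-opus"}

print(json.dumps(ai_proxy_config, indent=2))
```

The point of the sketch is the separation of concerns the release describes: credentials and model selection sit in the gateway, so rotating a key or swapping providers never requires an application deploy.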

Multi-LLM Integrations, AI Analytics, and Advanced Prompt Engineering

Builders want the best AI for their use case, and they want to iterate rapidly and build new capabilities without having to manage the cross-cutting concerns of AI usage themselves.

By upgrading to Kong Gateway 3.6, users can access this new suite of plugins focused entirely on AI and LLM use. The suite of free, open source plugins delivers a range of new capabilities, including:

  • Multi-LLM integration. Kong Inc.'s "AI proxy" plugin enables seamless integration of multiple LLM implementations, offering native support for industry leaders such as OpenAI, Azure AI, Cohere, Anthropic, Mistral, and Llama. The standardized interface allows for simple switching between LLMs without modifying application code, facilitating the use of diverse models and rapid prototyping.
  • Central AI credential management. The "AI proxy" ensures secure and centralized storage of AI credentials within Kong Gateway. This design negates the need for credentials within applications, streamlining credential rotation and allowing updates directly from the gateway.
  • Layer 7 AI metrics collection. Leveraging the "AI proxy" plugin, users can now capture detailed Layer 7 AI analytics, including metrics such as request and response token counts as well as usage data per LLM provider and model. Integration with third-party platforms such as Datadog and New Relic, and with existing logging plugins in Kong Gateway (e.g., TCP, Syslog, Prometheus), is supported, enriching observability and offering insights into developer preferences.
  • No-code AI integrations. With the "AI request transformer" and "AI response transformer" plugins, AI capabilities are injected into API requests and responses without a single line of code. This allows for on-the-fly transformations such as real-time API response translations for internationalization, enriching and converting API traffic effortlessly.
  • Advanced AI prompt engineering. Kong introduces three new plugins dedicated to sophisticated prompt engineering. The "AI prompt template" facilitates the creation and central management of prompt templates within Kong Gateway, enhancing the agility of prompt updates without application code changes and ensuring compliance through approved template processes.
  • AI prompt decoration. The "AI prompt decorator" plugin allows for the consistent configuration of AI prompt contexts, automating the inclusion of rules and instructions with each AI request to enforce organizational compliance and restrict discussions on sensitive topics.
  • AI prompt firewall. The "AI prompt guard" offers a governance layer, establishing rules to authorize or block free-form prompts created by applications, ensuring that prompts adhere to approved standards before being transmitted to LLM providers.
  • Comprehensive AI egress with extensive features. The integration of these AI capabilities within Kong Gateway centralizes the management, security, and monitoring of AI traffic. It leverages over 1,000 existing official and community plugins for robust access control, rate limiting, and the creation of advanced traffic control rules. The AI Gateway is equipped from day one with all Kong Gateway features. Support for Konnect's developer portal and service catalog is also included.
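To make the prompt-governance bullets above more concrete, the sketch below implements the kind of allow/deny matching an "AI prompt guard"-style plugin could apply before a prompt is forwarded to an LLM provider. The rule semantics (deny patterns win; if allow patterns exist, at least one must match) and all patterns are illustrative assumptions, not Kong's actual defaults or configuration.

```python
import re

# Hypothetical rule sets an operator might configure.
# Patterns are placeholders for demonstration only.
ALLOW_PATTERNS = [r"^Summarize", r"^Translate"]
DENY_PATTERNS = [r"(?i)password", r"(?i)salary"]

def prompt_allowed(prompt: str) -> bool:
    """Return True if a free-form prompt passes the guard rules.

    Deny rules are checked first; if any allow rules are
    configured, the prompt must also match at least one.
    """
    if any(re.search(p, prompt) for p in DENY_PATTERNS):
        return False
    if not ALLOW_PATTERNS:
        return True
    return any(re.search(p, prompt) for p in ALLOW_PATTERNS)

print(prompt_allowed("Summarize this ticket"))  # True
print(prompt_allowed("Translate my password"))  # False: deny rule wins
```

Running this check at the gateway, rather than in each application, is what lets platform teams enforce one approved standard across every team's AI traffic.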

These newly released plugins, developed after extensive research and collaboration with select customers and users of Kong Gateway, address the most prevalent AI use cases. We anticipate releasing additional AI features in the future and welcome user feedback on these developments.
