AI Gateway

Overview

What is AI Gateway?

The AI Gateway is a centralized plugin for configuring and managing AI capabilities in Backstage. It allows you to set up and manage various AI models, providers, and re-ranking capabilities that can be used across different plugins, such as AiKA.

What You'll Accomplish

By the end of this guide, you'll have AI Gateway configured with one or more AI providers, enabling AI-powered features across your Portal instance.

Supported Providers

AI Gateway currently supports the following providers:

  • AWS Bedrock - Amazon's managed service for foundation models
  • Azure OpenAI - OpenAI models through Microsoft Azure infrastructure
  • OpenAI - Direct access to OpenAI's models (GPT-4, GPT-3.5, etc.)
  • Cohere - Cohere's language models and reranking capabilities
  • Google Generative AI - Google's Gemini models
  • Ollama - Self-hosted open source models

Step 1: Enable AI Gateway in Portal

You can enable or disable the AI Gateway at any time through the Plugin settings page.


Step 2: Configure AI Providers

We provide support for a variety of AI providers and models. You can configure these in the Plugin settings page under the AI Gateway section.

(Screenshot: AiKA configuration in the Plugin settings page)

Tip

You can configure one or more providers at the same time. Plugins let you select which provider and model to use, or you can set a default provider that applies across all plugins.

Note

The baseURL and headers options are useful when routing requests through a proxy or gateway. For direct connections to AI providers, omit them and the provider's default endpoints will be used.

Provider Configuration

Navigate to Admin → Plugin Settings → AI Gateway to configure your providers.

OpenAI

  • API Key (required): your OpenAI API key
  • Base URL (optional): to point at a gateway
  • Headers (optional): e.g. for authentication with a gateway

Azure OpenAI

  • API Key (required): your Azure API key
  • Resource Name (required*): used in the URL https://{resourceName}.openai.azure.com/openai/v1
  • API Version (optional): defaults to preview
  • Base URL (optional): to point at a gateway (overrides Resource Name when provided)
  • Headers (optional): e.g. for authentication with a gateway

*Either Resource Name or Base URL must be provided.

AWS Bedrock

  • Region (required): AWS region (e.g., us-east-1)
  • Access Key ID (required): your AWS access key ID
  • Secret Access Key (required): your AWS secret access key
  • Session Token (optional): only needed with temporary credentials
  • Base URL (optional): to point at a gateway
  • Headers (optional): e.g. for authentication with a gateway

Cohere

  • API Key (required): your Cohere API key
  • Base URL (optional): to point at a gateway
  • Headers (optional): e.g. for authentication with a gateway

Google Generative AI

  • API Key (required): your Google API key
  • Base URL (optional): to point at a gateway
  • Headers (optional): e.g. for authentication with a gateway

Ollama

  • Base URL (required): URL of your Ollama instance
  • Headers (optional): e.g. for authentication with a gateway

Step 3: Configure Default Models (Optional)

You can set a default provider that will be used across all plugins that support AI Gateway. The model fields are formatted as provider|model.

In the Plugin settings page under the Models section, you can configure:

  • Language Model (provider|model): used for chat and text generation. Example: openai|gpt-4
  • Reranking Model (provider|model): used for search result reranking. Example: cohere|rerank-v3.5
  • Embedding Model (provider|model): used for text embeddings. Example: openai|text-embedding-3-small
  • Image Model (provider|model): used for image generation. Example: openai|dall-e-3

Next Steps

With AI Gateway configured, you can now:

  • Enable AiKA for AI-powered assistance in your Portal instance
  • Configure Semantic Search to use AI-powered semantic search capabilities
  • Explore Provider Integrations to learn more about each provider's specific features
  • Optimize Model Selection by testing different models for your use cases