Getting Started with AiKA

Overview

What is AiKA?

AiKA (AI Knowledge Assistant), currently in development, is an internal AI-powered assistant that helps you conversationally discover, understand, and leverage knowledge across the company. AiKA integrates seamlessly with Backstage Search and TechDocs, answering technical and organizational questions by drawing on a continually updated knowledge base, including TechDocs and the Software Catalog, with more sources to come.

Why AiKA?

  • Controlled Knowledge Base: You get answers based on curated, relevant documents within your organization, which are much more accurate than answers from general LLM usage. Inquire about anything across TechDocs, the Software Catalog, and more in one place.
  • Bring your own models: You choose what models and providers AiKA uses in the backend based on what works for your organization.
  • Compliance: Because the underlying LLM and re-ranking technologies are highly customizable, you know exactly where data is flowing.

Knowledge Sources: What AiKA is aware of

AiKA draws from internal sources (more to come):

  • TechDocs: Internal documentation, including engineering, data, product, and process docs.
  • Confluence: Confluence spaces indexed by Backstage search can be used as a knowledge source.
  • Catalog: Software catalog information can be used as a knowledge source.

In general, AiKA can draw from any source that is indexed by Backstage Search. If you have set up a custom collator, add the name associated with that collator to the aika > knowledgeTypes section in the Config Manager.

To configure AiKA to draw from specific sources, add each source to the aika > knowledgeTypes section in the Config Manager. The most common sources are:

aika:
  knowledgeTypes:
    - techdocs
    - confluence
    - software-catalog
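If you have registered a custom collator with Backstage Search, its name can be appended to the same list. A minimal sketch, where `my-custom-docs` is a hypothetical collator name standing in for whatever name your collator registers:

```yaml
aika:
  knowledgeTypes:
    - techdocs
    - software-catalog
    # "my-custom-docs" is a placeholder; use the name your custom collator registers
    - my-custom-docs
```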

Installation

Enabling AiKA in Portal

You have the option to enable or disable AiKA at any time through the Config Manager.

Set up parameters

Screenshot detailing AiKA Config

aika > models > language (required): The LLM provider and model to use for the chat completion and keyword extraction.

  • Format: PROVIDER|MODEL
  • Example: openai|gpt-4o

aika > models > re-ranking (optional): Provide a re-ranking model. This is useful for improving the relevance of the generated content and reducing inconsistencies created by document indexing.

The integrations page lists LLM providers, reranking capability providers, and their models that AiKA supports. Please enable the module you'd like to use in the Config Manager, and fill out its appropriate configuration.
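Putting the two model parameters together, the resulting configuration might look like the following sketch. The `openai|gpt-4o` value comes from the example above; the re-ranking value is a hypothetical `PROVIDER|MODEL` pair shown only to illustrate the format, so substitute a provider and model listed on the integrations page:

```yaml
aika:
  models:
    # Required: LLM for chat completion and keyword extraction (PROVIDER|MODEL)
    language: openai|gpt-4o
    # Optional: re-ranking model; value below is illustrative only
    re-ranking: cohere|rerank-english-v3.0
```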

Using AiKA

Preferences

Preferences Screenshot

| Function | Use When… | Avoid When… |
| --- | --- | --- |
| Use Internal Knowledge | You need company-specific, up-to-date answers based on your peers' contributions to TechDocs and the Software Catalog | Your question is general or public |
| Streaming | You want a real-time, incremental answer you can follow as it is generated | You want a single complete answer and response time is negligible |
| System Prompt | You want to customize model behavior or test prompt changes | You're happy with the default behavior |

Customizing the System Prompt

The system prompt can be plain text or a Jinja2 template that supports the following variables. You can verify the validity of the system prompt using this tool.

  • current_date (str): The current date.

  • use_internal_knowledge (bool): Reflects the current setting of the internal knowledge.

  • internal_knowledge (str or null): The internal knowledge retrieved from the knowledge base related to the current conversation. If the Use Internal Knowledge setting is disabled, or no relevant information is found in the knowledge base, this will be null.

For example, a system prompt that uses the current date and the internal knowledge could look like this:

You are a helpful assistant that can answer questions about the company.

Today's date is {{ current_date }}.

{% if internal_knowledge %}
The following is relevant information from the knowledge base:
{{ internal_knowledge }}
{% endif %}