Getting Started with AiKA
Overview
What is AiKA?
AiKA (AI Knowledge Assistant), currently in development, is an internal AI-powered assistant that helps you conversationally discover, understand, and leverage knowledge across the company. AiKA integrates seamlessly with Backstage Search and TechDocs, answering technical and organizational questions by drawing on a continually updated knowledge base, including TechDocs and the software catalog, with more sources to come.
Why AiKA?
- Controlled Knowledge Base: You get answers based on curated, relevant documents within your organization, which are much more accurate than general LLM usage. Inquire about anything across TechDocs, Software Catalog, and more in one place.
- Bring your own models: You choose what models and providers AiKA uses in the backend based on what works for your organization.
- Compliance: Because the underlying LLM and re-ranking technologies are highly customizable, you know exactly where data is flowing.
Knowledge Sources: What AiKA is aware of
AiKA draws from internal sources (more to come):
- TechDocs: Internal documentation, including engineering, data, product, and process docs.
- Confluence: Confluence spaces indexed by Backstage search can be used as a knowledge source.
- Catalog: Software catalog information can be used as a knowledge source.
In general, AiKA can draw from any source that is indexed by Backstage Search. If you have set up a custom collator, add the name associated with that collator to the `aika > knowledgeTypes` section in the Config Manager.
To configure AiKA to draw from specific sources, add each source to the `aika > knowledgeTypes` section in the Config Manager.
The most common sources are:

```yaml
aika:
  knowledgeTypes:
    - techdocs
    - confluence
    - software-catalog
```
Installation
Enabling AiKA in Portal
You have the option to enable or disable AiKA at any time through the Config Manager.
Set up Parameters
`aika > models > language` (required): The LLM provider and model to use for chat completion and keyword extraction.
- Format: `PROVIDER|MODEL`
- Example: `openai|gpt-4o`

`aika > models > re-ranking` (optional): Provide a re-ranking model. This is useful for improving the relevance of the generated content and reducing inconsistencies created by document indexing.

The integrations page lists the LLM providers, re-ranking providers, and models that AiKA supports. Enable the module you'd like to use in the Config Manager, and fill out its configuration.

`aika > knowledgeTypes` (optional): The types of data AiKA is aware of. Defaults to only `techdocs` if unspecified.
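Taken together, the parameters above can be sketched as a single configuration block. This is a hedged sketch: the nesting mirrors the `aika > …` paths above, and the re-ranking value is a placeholder to substitute with a model from the integrations page.

```yaml
aika:
  models:
    language: openai|gpt-4o      # required, in PROVIDER|MODEL format
    re-ranking: PROVIDER|MODEL   # optional; substitute a supported re-ranking model
  knowledgeTypes:                # optional; defaults to techdocs only
    - techdocs
```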
Custom Knowledge Sources
By default, only the TechDocs knowledge source is enabled in AiKA. To improve the relevance and accuracy of AiKA's answers, you can add other knowledge sources alongside TechDocs. Expanding your knowledge base helps AiKA better respond to more diverse queries.
Getting Started
If you want to add more sources, here are some helpful resources:
- Community-created collators: Discover existing collators for different platforms and data sources. Using a community collator can save setup time.
- How to install custom packages in Portal: Guidance for installing plugins in your Backstage instance.
- How to create a custom collator: For when you want to index a data source not already supported.
Example: Adding Confluence as a Knowledge Source
1. Add the Confluence Collator Module
- Go to the Config Manager.
- Add the following module, which enables indexing Confluence content in your search backend:
  `@backstage-community/plugin-search-backend-module-confluence-collator`
2. Configure the Module
- Review the Confluence collator module documentation to identify required configuration fields (such as the Confluence base URL, credentials, etc.).
- In the Config Manager, navigate to the search plugin configuration at `/config-manager/search`.
- Update your configuration values to include the Confluence collator settings.
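The Confluence collator settings might look something like the following. This is an illustrative sketch only: the field names shown here (`baseUrl`, `auth`, `spaces`) are assumptions, so consult the collator module's documentation for the actual configuration schema.

```yaml
# Sketch only -- verify field names against the collator module's docs.
confluence:
  baseUrl: https://confluence.example.com
  auth:
    type: bearer
    token: ${CONFLUENCE_TOKEN}
  spaces: []   # empty list to index all spaces, or list specific space keys
```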
3. Register the New Document Type with AiKA
- After adding the module, a new document type (see the collator code) will be available.
- To make AiKA aware of this new knowledge source:
  - Navigate to AiKA's configuration in Config Manager at `/config-manager/aika`.
  - Update the `knowledgeTypes` list to include the new document type (e.g., `confluence`).
4. Re-index the Knowledge Base
- After configuration, re-index your knowledge base (by default, re-indexing starts automatically and may take a couple of minutes to complete). This process makes the new Confluence documents available to AiKA for answering queries.
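After these steps, AiKA's knowledge-type list would look like this, mirroring the common-sources example earlier in this page:

```yaml
aika:
  knowledgeTypes:
    - techdocs
    - confluence   # document type added by the Confluence collator
```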
Using AiKA
Preferences
| Function | Use When… | Avoid When… |
|---|---|---|
| Use Internal Knowledge | You need company-specific, up-to-date answers based on your peers' contributions to TechDocs and Software Catalog | Your question is general/public |
| Streaming | You want a real-time, incremental answer so you can follow along as it's generated | You want a single complete answer and response time is negligible |
| System Prompt | You want to customize model behavior or test prompt changes | You're happy with the default behavior |
Customizing the System Prompt
The system prompt can be plain text or a Jinja2 template that supports the following variables. You can verify the validity of the system prompt using this tool.
- `current_date` (str): The current date.
- `use_internal_knowledge` (bool): Reflects the current setting of the internal knowledge preference.
- `internal_knowledge` (str or null): The internal knowledge retrieved from the knowledge base related to the current conversation. If the use internal knowledge setting is disabled, or there is no relevant information found in the knowledge base, this will be null.
For example, a system prompt that uses the current date and the internal knowledge could look like this. Guarding on `internal_knowledge` itself covers both the setting being disabled and the no-results case, since the variable is null in either situation:

```jinja
You are a helpful assistant that can answer questions about the company.
Today's date is {{ current_date }}.
{% if internal_knowledge %}
The following is relevant information from the knowledge base:
{{ internal_knowledge }}
{% endif %}
```
Tracing
Why Trace AiKA?
Tracing provides detailed observability into AiKA's chat processing pipeline, capturing how a chat flows from the initial question through knowledge retrieval and re-ranking to final response generation. This is useful for debugging, for understanding which kinds of queries and knowledge bases work best in your portal instance, and for assessing the quality of the model's outputs.
How to Enable Tracing
Tracing can be configured through the Config Manager under the AiKA settings. In the `aika > tracing > enableTracing` section, toggle the option and start the module you want to export traces to. Currently we support exporting traces to the Arize Phoenix platform only, which requires an additional `apiKey` and `url` in the module options.
When the `enableTracing` flag is toggled on and the module has started, traces will be captured for each conversation, providing insight into the AI's decision-making process.
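As a sketch, the tracing configuration might look like the following. The `aika` nesting mirrors the path above; the `phoenixExporter` key is a hypothetical name for the module's options block, whose actual location comes from the module's configuration in the Config Manager.

```yaml
aika:
  tracing:
    enableTracing: true

# Arize Phoenix exporter module options (key name and placement illustrative):
phoenixExporter:
  apiKey: ${PHOENIX_API_KEY}
  url: https://phoenix.example.com
```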