# Ollama

## Overview
Ollama is an open-source platform for running large language models locally or on your own infrastructure. AiKA can be configured to use Ollama as a language model provider.
## Configuration
Ollama requires a base URL to be set in the config manager. This is the static base URL of the Ollama instance you wish to use, and it will typically look like `http://localhost:11434/api`.
```yaml
aika:
  ollama:
    baseUrl: http://localhost:11434/api
```
To use Ollama with AiKA, set the following in the config manager. The `language` field is formatted as `provider|model`.
```yaml
aika:
  models:
    language: ollama|<model-name>
```
Replace `<model-name>` with the name of the Ollama model you wish to use (e.g., `llama2` or `mistral`).
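The `provider|model` convention can be sketched in a few lines of Python. The helper below is purely illustrative (its name and behavior are assumptions, not part of AiKA), but it shows how such a setting splits into its two parts:

```python
def parse_language_model(value: str) -> tuple[str, str]:
    """Split an AiKA-style language setting of the form 'provider|model'.

    Hypothetical helper for illustration; AiKA's internal parsing may differ.
    """
    provider, sep, model = value.partition("|")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider|model', got {value!r}")
    return provider, model
```

For example, `parse_language_model("ollama|llama2")` yields the provider `ollama` and the model name `llama2`.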
## Example
```yaml
aika:
  models:
    language: ollama|llama2
```
Ensure your Ollama instance is running and accessible to AiKA. For more details on available models and setup, refer to the Ollama documentation.
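As a quick sanity check that the instance is reachable, you can query Ollama's `GET /api/tags` endpoint, which lists the models installed locally. The Python sketch below is a standalone example using only the standard library; the function names are illustrative and not part of AiKA:

```python
import json
import urllib.request


def tags_endpoint(base_url: str) -> str:
    """Build the /api/tags URL from the configured base URL."""
    return base_url.rstrip("/") + "/tags"


def list_ollama_models(base_url: str = "http://localhost:11434/api") -> list[str]:
    """Return the names of models installed on the Ollama instance.

    Raises URLError if the instance is not running or not reachable.
    """
    with urllib.request.urlopen(tags_endpoint(base_url), timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]
```

If the model you configured (e.g., `llama2`) does not appear in the returned list, pull it with the Ollama CLI before pointing AiKA at it.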