Quick start

This quick start explains how to use Ollama to provide LLM capabilities to Liz. You can switch providers after the initial installation; see How-tos for Admin.

Prerequisites

The current version of the AI Assistant requires the components listed under Technical Requirements below.

Make sure you have the rights to deploy CRDs (cluster-admin) on the host cluster.
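As a quick sanity check (a sketch, assuming kubectl is already configured against the host cluster), you can verify CRD-creation rights with:

```shell
# Prints "yes" if the current user can create CRDs
# (i.e. has cluster-admin or equivalent permissions)
kubectl auth can-i create customresourcedefinitions
```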

Technical Requirements

Here is a table of the supported AI components and their requirements.

You only need to meet the requirements for the specific components you intend to run. Requirements will vary based on the specific large language model (LLM) you choose to deploy.

| LLM Model | Requirements | GPU required? | GPU Requirement |
|---|---|---|---|
| gpt-oss:20b | Ollama installed | Yes | NVIDIA RTX A5000, NVIDIA RTX 4090, or similar (minimum 24 GB VRAM) |
| qwen3-embedding:4b | Ollama installed | Yes | NVIDIA RTX A5000, NVIDIA RTX 4090, or similar (minimum 24 GB VRAM) |
| gpt-oss:120b | Ollama installed | Yes | NVIDIA A100 or similar (minimum 80 GB VRAM) |
| Gemini | Google Workspace account | No | N/A |
| ChatGPT | OpenAI account | No | N/A |

If you run your own Ollama, make sure you have pulled at least gpt-oss:20b as the local model and, if RAG is enabled, qwen3-embedding:4b as the embeddings model. Consult the Ollama documentation for how to pull models.
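As a sketch, pulling the two models mentioned above with the standard Ollama CLI looks like this (run wherever your Ollama server stores its models):

```shell
# Pull the local chat model and the embeddings model used for RAG
ollama pull gpt-oss:20b
ollama pull qwen3-embedding:4b

# Confirm both models are available locally
ollama list
```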

Install Liz: The Rancher AI Assistant

Installation of Liz is a two-step process:

  • Deploy the agent and the MCP via the provided Helm chart.

  • Deploy the Rancher UI extension.

Install the Agent and the MCP on the local cluster

  1. Add the Helm repository:

    helm repo add rancher-ai https://rancher.github.io/rancher-ai-agent
    helm repo update
  2. Deploy the AI Agent chart on the local cluster.

    Create a values.yaml file with the following settings (for Ollama):

    llmModel: "gpt-oss:20b"
    ollamaUrl: "http://ollama:11434"
    activeLlm: "ollama"

    You can change these settings and providers later in Rancher’s Global Settings → AI Assistant tab.

    Then install the chart:

    helm install rancher-ai-agent rancher-ai/agent \
      --namespace cattle-ai-agent-system \
      --create-namespace \
      -f values.yaml
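Once the chart is installed, you can verify the deployment (a sketch, assuming the release name and namespace used in the command above):

```shell
# Check that the Helm release deployed successfully
helm status rancher-ai-agent -n cattle-ai-agent-system

# Watch the agent and MCP pods until they report Running
kubectl get pods -n cattle-ai-agent-system -w
```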

Install the UI extension

  1. In the Rancher Manager UI, click 'Extensions' in the menu bar.

    Extension button in menu bar
  2. Use the three-dot menu in the upper right and select 'Manage Repositories'.

    Menu to manage repositories
  3. Click 'Create' to add the repository.

  4. Configure the repository details.

  5. Click 'Create'.

  6. Wait for the ai-assistant-ui repository status to be Active.

  7. Go back to the 'Extensions' page and select the 'Available' tab.

    Configure the available repository.
  8. Find the AI Assistant card and click 'Install'.

  9. Select a version (or use the latest by default) and click 'Install'.

    Select a version or use the latest.
  10. Once the extension has finished installing, click the 'Reload' button that appears at the top of the page.