build+docs: Make ollama optional (#671)
topher-lo authored Dec 28, 2024
1 parent 8cad761 commit 9f4616b
Showing 6 changed files with 62 additions and 13 deletions.
10 changes: 10 additions & 0 deletions docker-compose.ollama.yml
@@ -0,0 +1,10 @@
services:
ollama:
image: ollama/ollama:${OLLAMA__VERSION}
ports:
- 11434:11434
volumes:
- ollama:/root/.ollama

volumes:
ollama:
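The `ollama` service image is pinned by the `${OLLAMA__VERSION}` variable, so Compose interpolation fails (or silently produces `ollama/ollama:`) if that variable is missing from the environment or `.env`. A minimal guard sketch, assuming a default tag purely for illustration (the pinned version is not part of this commit):

```shell
# Sketch: fail fast if OLLAMA__VERSION is unset before composing the stack.
# The 0.5.4 fallback tag is illustrative, not taken from this commit.
OLLAMA__VERSION="${OLLAMA__VERSION:-0.5.4}"
if [ -z "${OLLAMA__VERSION}" ]; then
  echo "error: OLLAMA__VERSION is not set" >&2
  exit 1
fi
# This is the image reference Compose will resolve for the ollama service.
echo "resolved image: ollama/ollama:${OLLAMA__VERSION}"
```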
12 changes: 0 additions & 12 deletions docker-compose.yml
@@ -172,21 +172,9 @@ services:
depends_on:
- temporal_postgres_db

ollama:
container_name: ollama
tty: true
restart: unless-stopped
networks:
- core
image: ollama/ollama:${OLLAMA__VERSION}
ports:
- 11434:11434
volumes:
- ollama:/root/.ollama
volumes:
core-db:
temporal-db:
ollama:

networks:
core:
Binary file added docs/img/self-hosting/ai-action.png
3 changes: 2 additions & 1 deletion docs/mint.json
@@ -90,7 +90,8 @@
"group": "Deployment Options",
"pages": [
"self-hosting/deployment-options/docker-compose",
"self-hosting/deployment-options/aws-ecs"
"self-hosting/deployment-options/aws-ecs",
"self-hosting/deployment-options/ollama"
]
},
{
47 changes: 47 additions & 0 deletions docs/self-hosting/deployment-options/ollama.mdx
@@ -0,0 +1,47 @@
---
title: Self-hosted LLMs
description: Learn how to self-host LLMs in Tracecat with Ollama.
---

## Prerequisites

- Basic Tracecat deployment with [Docker Compose](/self-hosting/deployment-options/docker-compose)
- Minimum 10GB of disk space

## Instructions

<Note>
Deploying self-hosted LLMs is resource intensive: the Ollama Docker image alone is over 1.5GB, and model weights are additional large downloads that vary greatly in size.

Only models less than 5GB in size are currently supported.
</Note>

Tracecat supports self-hosted LLMs through [Ollama](https://ollama.ai/).

Supported models:

- [`llama3.2`](https://ollama.com/library/llama3.2): 2.0GB
- [`llama3.2:1b`](https://ollama.com/library/llama3.2): 1.3GB

<Steps>
<Step title="Configure open source models">
Specify the open source models you wish to use in Tracecat by setting the `TRACECAT__PRELOAD_OSS_MODELS` environment variable in the `.env` file.

For example, to preload the `llama3.2` model, set the following:
```
TRACECAT__PRELOAD_OSS_MODELS=llama3.2
```
</Step>
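The same `.env` change can be scripted idempotently, so re-running it replaces any existing value instead of appending a duplicate line. A sketch, assuming `.env` lives in the current deployment directory:

```shell
# Sketch: set TRACECAT__PRELOAD_OSS_MODELS in .env, replacing any prior value.
# Assumes .env sits in the working directory, as in a standard Compose setup.
touch .env
grep -v '^TRACECAT__PRELOAD_OSS_MODELS=' .env > .env.tmp || true
echo 'TRACECAT__PRELOAD_OSS_MODELS=llama3.2' >> .env.tmp
mv .env.tmp .env
# Show the resulting setting.
grep '^TRACECAT__PRELOAD_OSS_MODELS' .env
```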
<Step title="Deploy">
Deploy Tracecat with the Ollama docker compose extension:
```bash
docker compose -f docker-compose.yml -f docker-compose.ollama.yml up -d
```
</Step>
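After the stack comes up, the ollama container may take a moment before it serves requests. One way to check readiness is to poll Ollama's `/api/tags` endpoint on the 11434 port mapped in `docker-compose.ollama.yml`; the helper below only builds the URL, so it has no dependency on a live deployment:

```shell
# Sketch: build the health-check URL for the ollama service.
# Host defaults to localhost; port follows the 11434:11434 mapping.
ollama_url() {
  echo "http://${1:-localhost}:11434/api/tags"
}
# Against a live stack you would poll it, e.g.:
#   until curl -fsS "$(ollama_url)" >/dev/null; do sleep 2; done
ollama_url
```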
<Step title="AI Action">
You can now use Tracecat's AI action to call your preloaded open source LLMs.
For example, to call the `llama3.2` model, you can specify the following arguments:
![AI Action](/img/self-hosting/ai-action.png)
</Step>
</Steps>
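Under the hood, calls like the AI action above reach the local Ollama server over its HTTP API. A hedged sketch of the kind of request involved, using Ollama's documented `/api/generate` endpoint (the prompt text is illustrative only, not Tracecat's actual payload):

```shell
# Sketch: a direct request to the local Ollama /api/generate endpoint,
# useful for verifying a preloaded model outside of Tracecat.
MODEL="llama3.2"
PROMPT="Summarize this alert in one sentence."
# Build the JSON payload; stream=false asks for a single response object.
printf '{"model":"%s","prompt":"%s","stream":false}\n' "$MODEL" "$PROMPT"
# Against a running deployment you would send it, e.g.:
#   printf '...' | curl -s http://localhost:11434/api/generate -d @-
```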
3 changes: 3 additions & 0 deletions docs/self-hosting/introduction.mdx
@@ -14,3 +14,6 @@ Choose from a number of deployment options listed below to get started.
Use Terraform to deploy Tracecat into ECS Fargate.
</Card>
</CardGroup>

Interested in using open source LLMs (e.g. llama3.2) in Tracecat's AI actions?
Check out our guide on deploying [self-hosted LLMs](/self-hosting/deployment-options/ollama) with Tracecat's Ollama Docker Compose extension.
