docs update for multi llm support + mintlify upgrade #13

Open · wants to merge 1 commit into `main`
149 changes: 149 additions & 0 deletions docs.json
@@ -0,0 +1,149 @@
{
"$schema": "https://mintlify.com/docs.json",
"theme": "mint",
"name": "Potpie",
"colors": {
"primary": "#0D9373",
"light": "#07C983",
"dark": "#0D9373"
},
"favicon": "/favicon.svg",
"navigation": {
"tabs": [
{
"tab": "Documentation",
"groups": [
{
"group": "Get Started",
"pages": [
"introduction",
"quickstart"
]
},
{
"group": "How to use Agents",
"pages": [
"agents/introduction",
"agents/debugging-agent",
"agents/qna-agent",
"agents/integration-test-agent",
"agents/unit-test-agent",
"agents/code-changes-agent"
]
},
{
"group": "API Access ",
"pages": [
"agents/api-access"
]
},
{
"group": "Custom Agents",
"pages": [
"custom-agents/introduction",
"custom-agents/configuration",
{
"group": "Tools",
"pages": [
"custom-agents/tools",
"custom-agents/tools/get_code_from_probable_node_name",
"custom-agents/tools/get_code_from_node_id",
"custom-agents/tools/get_code_from_multiple_node_ids",
"custom-agents/tools/ask_knowledge_graph_queries",
"custom-agents/tools/get_nodes_from_tags",
"custom-agents/tools/get_code_from_node_name",
"custom-agents/tools/get_code_graph_from_node_id",
"custom-agents/tools/get_code_graph_from_node_name",
"custom-agents/tools/change_detection"
]
}
]
}
]
},
{
"tab": "Open Source",
"openapi": {
"source": "https://production-api.potpie.ai/openapi.json",
"directory": "open-source"
},
"groups": [
{
"group": "Open Source",
"pages": [
"open-source/setup",
"open-source/getting-started",
"open-source/llms",
"open-source/ollama-integration/ollama"
]
},
{
"group": "Partners",
"pages": [
{
"group": "AgentOps-AI",
"pages": [
"open-source/agentops/agentstack-qna",
"open-source/agentops/agentstack-lld",
"open-source/agentops/agentstack-unittest"
]
},
{
"group": "CrewAIInc",
"pages": [
"open-source/crew-ai/crewai-qna",
"open-source/crew-ai/crewai-lld",
"open-source/crew-ai/crewai-unittest"
]
}
]
}
]
}
],
"global": {
"anchors": [
{
"anchor": "Documentation",
"href": "https://docs.potpie.ai",
"icon": "book-open-cover"
},
{
"anchor": "Community",
"href": "https://discord.gg/ryk5CMD5v6",
"icon": "discord"
},
{
"anchor": "Blog",
"href": "https://www.potpie.ai/blog",
"icon": "newspaper"
}
]
}
},
"logo": {
"light": "/logo/light.png",
"dark": "/logo/dark.png"
},
"navbar": {
"links": [
{
"label": "Support",
"href": "mailto:[email protected]."
}
],
"primary": {
"type": "button",
"label": "Star us on GitHub ⭐️ ",
"href": "https://github.com/potpie-ai/potpie"
}
},
"footer": {
"socials": {
"x": "https://x.com/potpiedotai",
"github": "https://github.com/potpie-ai/potpie",
"linkedin": "https://www.linkedin.com/company/potpieai",
"discord": "https://discord.gg/ryk5CMD5v6"
}
}
}
77 changes: 77 additions & 0 deletions open-source/llms.mdx
@@ -0,0 +1,77 @@
---
title: "LLMs"
description: 'Setup for multiple LLMs'
---


Potpie is designed to work with a variety of large language models (LLMs). This page lists the models Potpie supports and explains how to configure them.

## Supported Models

The supported models are listed below, categorized by provider and size:

---

### 1. **OpenAI Models**

- **Small Model:**
`openai/gpt-4o-mini`

- **Large Model:**
`openai/gpt-4o`

---

### 2. **Anthropic Models**

- **Small Model:**
`anthropic/claude-3-5-haiku-20241022`

- **Large Model:**
`anthropic/claude-3-7-sonnet-20250219`

---

### 3. **DeepSeek Models**

- **Small Model:**
`openrouter/deepseek/deepseek-chat`

- **Large Model:**
`openrouter/deepseek/deepseek-chat`

---

### 4. **Meta Llama Models**

- **Small Model:**
`openrouter/meta-llama/llama-3.3-70b-instruct`

- **Large Model:**
`openrouter/meta-llama/llama-3.3-70b-instruct`

---

### 5. **Google Gemini Models**

- **Small Model:**
`openrouter/google/gemini-2.0-flash-001`

- **Large Model:**
`openrouter/google/gemini-2.0-flash-001`

---
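To make the provider and size choice concrete, here is a hypothetical `.env` fragment selecting the Anthropic models listed above. The `LLM_PROVIDER`, `LOW_REASONING_MODEL`, and `HIGH_REASONING_MODEL` variable names follow the Ollama guide in these docs and are assumed to apply to hosted providers as well:

```shell
# Hypothetical example: selecting the Anthropic models listed above.
# Variable names assumed to match those used in the Ollama guide.
export LLM_PROVIDER=anthropic
export LOW_REASONING_MODEL=anthropic/claude-3-5-haiku-20241022
export HIGH_REASONING_MODEL=anthropic/claude-3-7-sonnet-20250219
```

Swap in the identifiers from any other provider section above to use a different backend.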

## Configuration

### Setting Up API Keys

Before using any model, ensure that the appropriate API keys are set up. Potpie checks for API keys in the following order:

1. **Environment Variables:**
- `LLM_API_KEY`
- `OPENAI_API_KEY`
- `{PROVIDER}_API_KEY` (e.g., `ANTHROPIC_API_KEY`)

2. **Secret Manager:**
Potpie can also retrieve API keys from a configured secret-management service. Make sure the user's keys are stored there under the expected names.
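The environment-variable lookup order above can be sketched as a small shell helper. This is an illustrative assumption about the documented behavior, not Potpie's actual code, and `resolve_api_key` is a hypothetical name:

```shell
# Sketch of the documented lookup order (assumed behavior):
# LLM_API_KEY first, then OPENAI_API_KEY, then {PROVIDER}_API_KEY.
resolve_api_key() {
  # Build the provider-specific variable name, e.g. "anthropic" -> ANTHROPIC_API_KEY
  provider_key_var="$(echo "$1" | tr '[:lower:]' '[:upper:]')_API_KEY"
  eval "provider_key=\${$provider_key_var}"
  # First non-empty value wins
  echo "${LLM_API_KEY:-${OPENAI_API_KEY:-$provider_key}}"
}
```

For example, with only `ANTHROPIC_API_KEY` set, `resolve_api_key anthropic` falls through to the provider-specific key; setting `LLM_API_KEY` overrides it.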
74 changes: 74 additions & 0 deletions open-source/ollama-integration/ollama.mdx
@@ -0,0 +1,74 @@
---
title: "Using Potpie with Ollama Models"
description: 'Configure and run Potpie with Ollama models for local use.'
---

# Running Potpie with Ollama Models

In this guide, you'll learn how to configure and run Potpie with Ollama models on your local machine.

## Step 1: Install Ollama

Before using Ollama models with Potpie, you need to install the **Ollama CLI** tool. Ollama allows you to run models locally, and you can install it with the following commands:

### Installation:

1. Open a terminal and run the following command to download and install Ollama:
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
2. Once installed, verify the installation by running:
```bash
ollama --version
```

## Step 2: Set Up Ollama Models

### Step 2.1: Pull the Required Models

To run Potpie with Ollama, first download the models you plan to use. Potpie distinguishes two model roles:

- **Low reasoning model:** used for generating the knowledge graph.
- **High reasoning model:** used for agent reasoning.

This guide uses the same model for both roles. Run the following command to pull it:

```bash
# Pull the model used for both knowledge graph generation and agent reasoning
ollama pull qwen2.5-coder:7b
```

Note that when you reference models in Potpie's configuration, they should be in the `provider/model_name` format expected by LiteLLM — for Ollama this means the `ollama_chat/` prefix, e.g. `ollama_chat/qwen2.5-coder:7b`. For more model options and details, refer to the [LiteLLM documentation](https://docs.litellm.ai/).

## Step 3: Ollama API Key

You can retrieve your API key by following these steps:

1. **Sign in to Ollama**: Go to Ollama's official website and sign in to your account.
2. **Open the keys page**: Navigate to your [Ollama keys settings](https://ollama.com/settings/keys).
3. **Copy your API key**: Copy the key shown there for use in the next step.


## Step 4: Configure Environment Variables

Once you have installed Ollama and pulled the models, you need to configure your environment to use these models with Potpie.

Open or create a `.env` file in the directory where you're running Potpie. Add the following configuration to specify the Ollama models:

```bash
# Set the LLM provider to Ollama
LLM_PROVIDER=ollama

# Set the API key
LLM_API_KEY=PASTE-YOUR-API-KEY-HERE

# Specify the model used for low reasoning (knowledge graph generation)
LOW_REASONING_MODEL=ollama_chat/qwen2.5-coder:7b

# Specify the model used for high reasoning (agent reasoning)
HIGH_REASONING_MODEL=ollama_chat/qwen2.5-coder:7b
```
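If you want to sanity-check the configuration before starting Potpie, you can write a sample `.env` and load it into your shell. The values here mirror the fragment above; `PASTE-YOUR-API-KEY-HERE` is a placeholder:

```shell
# Write a sample .env (placeholder values) and load it to verify parsing
cat > .env <<'EOF'
LLM_PROVIDER=ollama
LLM_API_KEY=PASTE-YOUR-API-KEY-HERE
LOW_REASONING_MODEL=ollama_chat/qwen2.5-coder:7b
HIGH_REASONING_MODEL=ollama_chat/qwen2.5-coder:7b
EOF

set -a          # export every variable assigned while sourcing
. ./.env
set +a

echo "Provider: $LLM_PROVIDER"
echo "Low reasoning model:  $LOW_REASONING_MODEL"
echo "High reasoning model: $HIGH_REASONING_MODEL"
```

This only confirms the variables are set as expected; Potpie itself reads the `.env` file on startup.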
All set! Potpie will now use your local Ollama models.