Commit 63b21a2

Cline docs (#50)
1 parent c342a32 commit 63b21a2

24 files changed: +295 −21 lines

docs/about/changelog.md (+5)

```diff
@@ -13,6 +13,11 @@ Major features and changes are noted here. To review all updates, see the
 
   Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)
 
+- **Cline support** - 28 Jan, 2025\
+  CodeGate version 0.1.14 adds support for
+  [Cline](https://github.com/cline/cline) with Anthropic, OpenAI, Ollama, and LM
+  Studio. See the [how-to guide](../how-to/use-with-cline.mdx) to learn more.
+
 - **Workspaces** - 22 Jan, 2025\
   Now available in CodeGate v0.1.12, workspaces help you organize and customize
   your AI-assisted development. Learn more in
```

docs/how-to/configure.md (+9 −8)

```diff
@@ -20,14 +20,15 @@ docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
 
 CodeGate supports the following parameters:
 
-| Parameter | Default value | Description |
-| :----------------------- | :---------------------------------- | :------------------------------------------------------------ |
-| `CODEGATE_OLLAMA_URL` | `http://host.docker.internal:11434` | Specifies the URL of an Ollama instance. Used when the provider in your plugin config is `ollama`. |
-| `CODEGATE_VLLM_URL` | `http://localhost:8000` | Specifies the URL of a model hosted by a vLLM server. Used when the provider in your plugin config is `vllm`. |
-| `CODEGATE_ANTHROPIC_URL` | `https://api.anthropic.com/v1` | Specifies the Anthropic engine API endpoint URL. |
-| `CODEGATE_OPENAI_URL` | `https://api.openai.com/v1` | Specifies the OpenAI engine API endpoint URL. |
-| `CODEGATE_APP_LOG_LEVEL` | `WARNING` | Sets the logging level. Valid values: `ERROR`, `WARNING`, `INFO`, `DEBUG` (case sensitive) |
-| `CODEGATE_LOG_FORMAT` | `TEXT` | Type of log formatting. Valid values: `TEXT`, `JSON` (case sensitive) |
+| Parameter | Default value | Description |
+| :----------------------- | :---------------------------------- | :----------------------------------------------------------- |
+| `CODEGATE_APP_LOG_LEVEL` | `WARNING` | Sets the logging level. Valid values: `ERROR`, `WARNING`, `INFO`, `DEBUG` (case sensitive) |
+| `CODEGATE_LOG_FORMAT` | `TEXT` | Type of log formatting. Valid values: `TEXT`, `JSON` (case sensitive) |
+| `CODEGATE_ANTHROPIC_URL` | `https://api.anthropic.com/v1` | Specifies the Anthropic engine API endpoint URL. |
+| `CODEGATE_LM_STUDIO_URL` | `http://host.docker.internal:1234` | Specifies the URL of your LM Studio server. |
+| `CODEGATE_OLLAMA_URL` | `http://host.docker.internal:11434` | Specifies the URL of your Ollama instance. |
+| `CODEGATE_OPENAI_URL` | `https://api.openai.com/v1` | Specifies the OpenAI engine API endpoint URL. |
+| `CODEGATE_VLLM_URL` | `http://localhost:8000` | Specifies the URL of the vLLM server to use. |
 
 
 ## Example: Use CodeGate with OpenRouter
```
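These parameters are set as environment variables on the CodeGate container. A minimal launch sketch, assuming the parameters are passed with Docker's `-e` flag as the `docker run` command at the top of the hunk suggests; the overridden URL and log level here are illustrative, not defaults:

```shell
# Launch CodeGate with two of the parameters above overridden.
# The Ollama URL and log level values are illustrative examples.
docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
  -e CODEGATE_OLLAMA_URL=http://192.168.1.50:11434 \
  -e CODEGATE_APP_LOG_LEVEL=DEBUG \
  --restart unless-stopped ghcr.io/stacklok/codegate:latest
```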

docs/how-to/install.md (+6 −3)

````diff
@@ -42,7 +42,8 @@ application settings, see [Configure CodeGate](./configure.md)
 
 ### Alternative run commands {#examples}
 
-Run with minimal functionality for use with **Continue** or **aider**:
+Run with minimal functionality for use with **Continue**, **aider**, or
+**Cline**:
 
 ```bash
 docker run -d -p 8989:8989 -p 9090:9090 --restart unless-stopped ghcr.io/stacklok/codegate:latest
@@ -150,15 +151,17 @@ persistent volume.
 
 Now that CodeGate is running, proceed to configure your IDE integration.
 
-- [Use CodeGate with GitHub Copilot](./use-with-copilot.mdx)
 - [Use CodeGate with aider](./use-with-aider.mdx)
+- [Use CodeGate with Cline](./use-with-cline.mdx)
 - [Use CodeGate with Continue](./use-with-continue.mdx)
+- [Use CodeGate with GitHub Copilot](./use-with-copilot.mdx)
 
 ## Remove CodeGate
 
 If you decide to stop using CodeGate, follow the removal steps for your IDE
 integration:
 
-- [Remove CodeGate - GitHub Copilot](./use-with-copilot.mdx#remove-codegate)
 - [Remove CodeGate - aider](./use-with-aider.mdx#remove-codegate)
+- [Remove CodeGate - Cline](./use-with-cline.mdx#remove-codegate)
 - [Remove CodeGate - Continue](./use-with-continue.mdx#remove-codegate)
+- [Remove CodeGate - GitHub Copilot](./use-with-copilot.mdx#remove-codegate)
````
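Before moving on to IDE configuration, you can sanity-check the container with standard Docker CLI commands; a sketch, assuming the container name `codegate` used throughout these docs:

```shell
# Verify the CodeGate container is running and review recent startup logs.
docker ps --filter "name=codegate"
docker logs --tail 20 codegate
```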

docs/how-to/use-with-aider.mdx (+1 −1)

```diff
@@ -15,7 +15,7 @@ CodeGate works with the following AI model providers through aider:
 - Local / self-managed:
   - [Ollama](https://ollama.com/)
 - Hosted:
-  - [OpenAI](https://openai.com/api/)
+  - [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs
 
 :::note
```
docs/how-to/use-with-cline.mdx (new file, +128)

````mdx
---
title: Use CodeGate with Cline
description: Configure the Cline IDE extension
sidebar_label: Use with Cline
sidebar_position: 90
---

import useBaseUrl from '@docusaurus/useBaseUrl';
import ThemedImage from '@theme/ThemedImage';

[Cline](https://github.com/cline/cline) is an autonomous coding agent for Visual
Studio Code that supports numerous API providers and models.

CodeGate works with the following AI model providers through Cline:

- Local / self-managed:
  - [Ollama](https://ollama.com/)
  - [LM Studio](https://lmstudio.ai/)
- Hosted:
  - [Anthropic](https://www.anthropic.com/api)
  - [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs

## Install the Cline extension

The Cline extension is available in the
[Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=saoudrizwan.claude-dev).

Install the extension using the **Install** link on the Marketplace page or
search for "Cline" in the Extensions panel within VS Code.

You can also install from the CLI:

```bash
code --install-extension saoudrizwan.claude-dev
```

If you need help, see
[Managing Extensions](https://code.visualstudio.com/docs/editor/extension-marketplace)
in the VS Code documentation.

## Configure Cline to use CodeGate

import ClineProviders from '../partials/_cline-providers.mdx';

To configure Cline to send requests through CodeGate:

1. Open the Cline extension sidebar from the VS Code Activity Bar and open its
   settings using the gear icon.

   <ThemedImage
     alt='Cline extension settings'
     sources={{
       light: useBaseUrl('/img/how-to/cline-settings-light.webp'),
       dark: useBaseUrl('/img/how-to/cline-settings-dark.webp'),
     }}
     width={'540px'}
   />

1. Select your provider and configure as detailed here:

   <ClineProviders />

1. Click **Done** to save the settings.

## Verify configuration

To verify that you've successfully connected Cline to CodeGate, open the Cline
sidebar and type `codegate version`. You should receive a response like
"CodeGate version 0.1.14":

<ThemedImage
  alt='Cline verification'
  sources={{
    light: useBaseUrl('/img/how-to/cline-codegate-version-light.webp'),
    dark: useBaseUrl('/img/how-to/cline-codegate-version-dark.webp'),
  }}
  width={'490px'}
/>

Try asking CodeGate about a known malicious Python package:

```plain title="Cline chat"
Tell me how to use the invokehttp package from PyPI
```

CodeGate responds with a warning and a link to the Stacklok Insight report about
this package:

```plain title="Cline chat"
Warning: CodeGate detected one or more malicious, deprecated or archived packages.

• invokehttp: https://www.insight.stacklok.com/report/pypi/invokehttp

The `invokehttp` package from PyPI has been identified as malicious and should
not be used. Please avoid using this package and consider using a trusted
alternative such as `requests` for making HTTP requests in Python.

Here is an example of how to use the `requests` package:

...
```

## Next steps

Learn more about CodeGate's features and how to use them:

- [Access the dashboard](./dashboard.md)
- [CodeGate features](../features/index.mdx)

## Remove CodeGate

If you decide to stop using CodeGate, follow these steps to remove it and revert
your environment.

1. Remove the custom base URL from your Cline provider settings.

1. Stop and remove the CodeGate container:

   ```bash
   docker stop codegate && docker rm codegate
   ```

1. If you launched CodeGate with a persistent volume, delete it to remove the
   CodeGate database and other files:

   ```bash
   docker volume rm codegate_volume
   ```
````

docs/index.md (+7 −1)

```diff
@@ -54,7 +54,13 @@ AI coding assistants / IDEs:
   - Anthropic
   - OpenAI
 
-- **[Aider](./how-to/use-with-aider.mdx)** with Ollama and OpenAI
+- **[Aider](./how-to/use-with-aider.mdx)** with Ollama and OpenAI-compatible
+  APIs
+
+- **[Cline](./how-to/use-with-cline.mdx)** with Visual Studio Code
+
+  CodeGate supports Ollama, Anthropic, OpenAI-compatible APIs, and LM Studio
+  with Cline.
 
 As the project evolves, we plan to add support for more IDE assistants and AI
 model providers.
```

docs/partials/.markdownlint.json (new file, +3)

```json
{
  "first-line-h1": false
}
```

docs/partials/_aider-providers.mdx (+6 −8)

```diff
@@ -1,10 +1,14 @@
 import Tabs from '@theme/Tabs';
 import TabItem from '@theme/TabItem';
 
+import LocalModelRecommendation from './_local-model-recommendation.md';
+
 <Tabs groupId="aider-provider">
   <TabItem value="openai" label="OpenAI" default>
 
 You need an [OpenAI API](https://openai.com/api/) account to use this provider.
+To use a different OpenAI-compatible endpoint, set the `CODEGATE_OPENAI_URL`
+[configuration parameter](../how-to/configure.md#config-parameters).
 
 Before you run aider, set environment variables for your API key and to set the
 API base URL to CodeGate's API port. Alternately, use one of aider's other
@@ -58,7 +62,7 @@ You need Ollama installed on your local system with the server running
 CodeGate connects to `http://host.docker.internal:11434` by default. If you
 changed the default Ollama server port or to connect to a remote Ollama
 instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable
-set to the correct URL. See [Configure CodeGate](/how-to/configure.md).
+set to the correct URL. See [Configure CodeGate](../how-to/configure.md).
 
 Before you run aider, set the Ollama base URL to CodeGate's API port using an
 environment variable. Alternately, use one of aider's other
@@ -105,13 +109,7 @@ aider --model ollama_chat/<MODEL_NAME>
 Replace `<MODEL_NAME>` with the name of a coding model you have installed
 locally using `ollama pull`.
 
-We recommend the [Qwen2.5-Coder](https://ollama.com/library/qwen2.5-coder)
-series of models. Our minimum recommendation for quality results is the 7
-billion parameter (7B) version, `qwen2.5-coder:7b`.
-
-This model balances performance and quality for typical systems with at least 4
-CPU cores and 16GB of RAM. If you have more compute resources available, our
-experimentation shows that larger models do yield better results.
+<LocalModelRecommendation />
 
 For more information, see the
 [aider docs for connecting to Ollama](https://aider.chat/docs/llms/ollama.html).
```
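The OpenAI tab in this partial says to set environment variables for your API key and for the API base URL before running aider. A sketch of that setup (aider reads `OPENAI_API_KEY` and `OPENAI_API_BASE` from the environment; the key value here is a placeholder):

```shell
# Point aider's OpenAI client at CodeGate's API port, then start aider.
export OPENAI_API_KEY="<YOUR_API_KEY>"                 # placeholder value
export OPENAI_API_BASE="http://localhost:8989/openai"  # CodeGate's API port
# aider   # start aider as usual
```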

docs/partials/_cline-providers.mdx (new file, +124)

```mdx
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import useBaseUrl from '@docusaurus/useBaseUrl';
import ThemedImage from '@theme/ThemedImage';

import LocalModelRecommendation from './_local-model-recommendation.md';

<Tabs groupId="cline-provider">
  <TabItem value="anthropic" label="Anthropic" default>

You need an [Anthropic API](https://www.anthropic.com/api) account to use this
provider.

In the Cline settings, choose **Anthropic** as your provider, enter your
Anthropic API key, and choose your preferred model (we recommend
`claude-3-5-sonnet-<latest>`).

To enable CodeGate, enable **Use custom base URL** and enter
`https://localhost:8989/anthropic`.

<ThemedImage
  alt='Cline settings for Anthropic'
  sources={{
    light: useBaseUrl('/img/how-to/cline-provider-anthropic-light.webp'),
    dark: useBaseUrl('/img/how-to/cline-provider-anthropic-dark.webp'),
  }}
  width={'540px'}
/>

  </TabItem>
  <TabItem value="openai" label="OpenAI">

You need an [OpenAI API](https://openai.com/api/) account to use this provider.
To use a different OpenAI-compatible endpoint, set the `CODEGATE_OPENAI_URL`
[configuration parameter](../how-to/configure.md) when you launch CodeGate.

In the Cline settings, choose **OpenAI Compatible** as your provider, enter your
OpenAI API key, and set your preferred model (example: `gpt-4o-mini`).

To enable CodeGate, set the **Base URL** to `https://localhost:8989/openai`.

<ThemedImage
  alt='Cline settings for OpenAI'
  sources={{
    light: useBaseUrl('/img/how-to/cline-provider-openai-light.webp'),
    dark: useBaseUrl('/img/how-to/cline-provider-openai-dark.webp'),
  }}
  width={'540px'}
/>

  </TabItem>
  <TabItem value="ollama" label="Ollama">

You need Ollama installed on your local system with the server running
(`ollama serve`) to use this provider.

CodeGate connects to `http://host.docker.internal:11434` by default. If you
changed the default Ollama server port or want to connect to a remote Ollama
instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable
set to the correct URL. See [Configure CodeGate](/how-to/configure.md).

In the Cline settings, choose **Ollama** as your provider and set the **Base
URL** to `http://localhost:8989/ollama`.

For the **Model ID**, provide the name of a coding model you have installed
locally using `ollama pull`.

<LocalModelRecommendation />

<ThemedImage
  alt='Cline settings for Ollama'
  sources={{
    light: useBaseUrl('/img/how-to/cline-provider-ollama-light.webp'),
    dark: useBaseUrl('/img/how-to/cline-provider-ollama-dark.webp'),
  }}
  width={'540px'}
/>

  </TabItem>
  <TabItem value="lmstudio" label="LM Studio">

You need LM Studio installed on your local system with a server running from LM
Studio's **Developer** tab to use this provider. See the
[LM Studio docs](https://lmstudio.ai/docs/api/server) for more information.

Cline uses large prompts, so you will likely need to increase the context length
for the model you've loaded in LM Studio. In the Developer tab, select the model
you'll use with CodeGate, open the **Load** tab on the right, increase the
**Context Length** to _at least_ 18k (18,432) tokens, then reload the model.

<ThemedImage
  alt='LM Studio dev server'
  sources={{
    light: useBaseUrl('/img/how-to/lmstudio-server-light.webp'),
    dark: useBaseUrl('/img/how-to/lmstudio-server-dark.webp'),
  }}
  width={'800px'}
/>

CodeGate connects to `http://host.docker.internal:1234` by default. If you
changed the default LM Studio server port, launch CodeGate with the
`CODEGATE_LM_STUDIO_URL` environment variable set to the correct URL. See
[Configure CodeGate](/how-to/configure.md).

In the Cline settings, choose **LM Studio** as your provider and set the **Base
URL** to `http://localhost:8989/openai`.

Set the **Model ID** to `lm_studio/<MODEL_NAME>`, where `<MODEL_NAME>` is the
name of the model you're serving through LM Studio (shown in the Developer tab),
for example `lm_studio/qwen2.5-coder-7b-instruct`.

<LocalModelRecommendation />

<ThemedImage
  alt='Cline settings for LM Studio'
  sources={{
    light: useBaseUrl('/img/how-to/cline-provider-lmstudio-light.webp'),
    dark: useBaseUrl('/img/how-to/cline-provider-lmstudio-dark.webp'),
  }}
  width={'635px'}
/>

  </TabItem>
</Tabs>
```
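Across the tabs in this partial, the base URL Cline needs depends only on the chosen provider. This hypothetical shell helper (not part of CodeGate; the URLs are the ones documented above) encodes the mapping:

```shell
# Map a Cline provider name to the CodeGate base URL given in the tabs above.
# Note: LM Studio uses CodeGate's /openai endpoint, not a dedicated path.
codegate_base_url() {
  case "$1" in
    anthropic) echo "https://localhost:8989/anthropic" ;;
    openai)    echo "https://localhost:8989/openai" ;;
    lmstudio)  echo "http://localhost:8989/openai" ;;
    ollama)    echo "http://localhost:8989/ollama" ;;
    *) echo "unknown provider: $1" >&2; return 1 ;;
  esac
}

codegate_base_url ollama   # prints http://localhost:8989/ollama
```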
docs/partials/_local-model-recommendation.md (new file, +6; filename per the `import LocalModelRecommendation` path above)

```md
We recommend the [Qwen2.5-Coder](https://ollama.com/library/qwen2.5-coder)
series of models. Our minimum recommendation for quality results is the 7
billion parameter (7B) version, `qwen2.5-coder:7b-instruct`. This model balances
performance and quality for systems with at least 4 CPU cores and 16GB of RAM.
If you have more compute resources available, our experimentation shows that
larger models do yield better results.
```
The remaining 14 changed files are binary webp images (contents not shown); reported sizes include 41.9 KB, 44.7 KB, 91.2 KB, and 97.1 KB.
