Update help docs (#8)
* Update help guides and other in-UI wording

* Remove unused image

* fix markdown tw plugin

---------

Co-authored-by: Giuseppe Scuglia <[email protected]>
danbarr and peppescg authored Dec 17, 2024
1 parent 251f700 commit 4bc3cbc
Showing 15 changed files with 266 additions and 183 deletions.
45 changes: 45 additions & 0 deletions package-lock.json


3 changes: 2 additions & 1 deletion package.json
@@ -41,6 +41,7 @@
   },
   "devDependencies": {
     "@eslint/js": "^9.15.0",
+    "@tailwindcss/typography": "^0.5.15",
     "@types/node": "^22.10.1",
     "@types/react": "^18.3.12",
     "@types/react-dom": "^18.3.1",
@@ -56,4 +57,4 @@
     "typescript-eslint": "^8.15.0",
     "vite": "^6.0.1"
   }
-}
+}
153 changes: 91 additions & 62 deletions public/help/continue-setup.md
@@ -1,103 +1,132 @@
# Quick setup - Continue with VS Code

For complete documentation, see:

- [Quickstart guide - Continue](https://docs.codegate.ai/quickstart-continue)
- [Use CodeGate with Continue](https://docs.codegate.ai/how-to/use-with-continue)
## Prerequisites

- Visual Studio Code
- Access to a supported AI model provider:
  - Anthropic API
  - OpenAI API
  - A vLLM server in OpenAI-compatible mode
  - Ollama running locally

## Install the Continue extension

The Continue extension is available in the
[Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=Continue.continue).

Install the extension using the **Install** link on the Marketplace page or search
for "Continue" in the Extensions panel within VS Code.

You can also install from the CLI:

```bash
code --install-extension Continue.continue
```

Once you have installed the extension, you should be able to see the Continue
icon in the Activity Bar.

![Continue icon](./images/continue-extension-light.webp)

## Configure Continue to use CodeGate

To configure Continue to send requests through CodeGate:

1. Configure the [chat](https://docs.continue.dev/chat/model-setup) and
   [autocomplete](https://docs.continue.dev/autocomplete/model-setup) settings
   in Continue for your desired AI model(s).

2. Open the Continue [configuration file](https://docs.continue.dev/reference),
   `~/.continue/config.json`. You can edit this file directly or access it from
   the gear icon ("Configure Continue") in the Continue chat interface.

   ![Continue extension settings](./images/continue-config-light.webp)

3. Add the `apiBase` property to the `models` entry (chat) and
   `tabAutocompleteModel` (autocomplete) sections of the configuration file.
   This tells Continue to use the CodeGate container running locally on your
   system as the base URL for your LLM API, instead of the default.

   ```json
   "apiBase": "http://127.0.0.1:8989/PROVIDER"
   ```

   Replace `/PROVIDER` with one of: `/anthropic`, `/ollama`, `/openai`, or
   `/vllm` to match your LLM provider.

4. Save the configuration file.
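The provider-to-route mapping in step 3 can be sketched as a small helper. This is illustrative only: the function and constant names are hypothetical, not part of Continue or CodeGate; it simply encodes the four routes listed above.

```python
# Build the CodeGate apiBase URL for a given provider.
# Host, port, and route names come from the guide above; the helper
# itself is a hypothetical sketch, not a CodeGate or Continue API.

CODEGATE_BASE = "http://127.0.0.1:8989"
SUPPORTED_PROVIDERS = {"anthropic", "ollama", "openai", "vllm"}

def codegate_api_base(provider: str) -> str:
    """Return the apiBase value for a supported provider."""
    if provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"unsupported provider: {provider!r}")
    return f"{CODEGATE_BASE}/{provider}"

print(codegate_api_base("anthropic"))  # http://127.0.0.1:8989/anthropic
```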

### Examples

Example Continue chat configurations for Anthropic, OpenAI, Ollama, and vLLM:

```json
"models": [
  {
    "title": "CodeGate-Anthropic",
    "provider": "anthropic",
    "model": "claude-3-5-sonnet-latest",
    "apiKey": "YOUR_API_KEY",
    "apiBase": "http://localhost:8989/anthropic"
  },
  {
    "title": "CodeGate-OpenAI",
    "provider": "openai",
    "model": "gpt-4o",
    "apiKey": "YOUR_API_KEY",
    "apiBase": "http://localhost:8989/openai"
  },
  {
    "title": "CodeGate-Ollama",
    "provider": "ollama",
    "model": "codellama:7b-instruct",
    "apiBase": "http://localhost:8989/ollama"
  },
  {
    "title": "CodeGate-vLLM",
    "provider": "vllm",
    "model": "Qwen/Qwen2.5-Coder-14B-Instruct",
    "apiKey": "YOUR_API_KEY",
    "apiBase": "http://localhost:8989/vllm"
  }
],
```
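If you already have a populated config.json, the `apiBase` edit from step 3 can also be scripted. A minimal sketch follows; the helper is hypothetical, not a CodeGate tool, and you should back up your config before editing it programmatically.

```python
import json

# Continue provider names that CodeGate can proxy, mapped to CodeGate's
# routes (taken from the provider list in the guide above).
ROUTES = {"anthropic": "anthropic", "openai": "openai",
          "ollama": "ollama", "vllm": "vllm"}

def point_models_at_codegate(config: dict, base: str = "http://localhost:8989") -> dict:
    """Set apiBase on every chat model whose provider CodeGate supports."""
    for model in config.get("models", []):
        route = ROUTES.get(model.get("provider"))
        if route is not None:
            model["apiBase"] = f"{base}/{route}"
    return config

# Example: a one-model config before and after the rewrite.
config = {"models": [{"title": "CodeGate-Ollama", "provider": "ollama",
                      "model": "codellama:7b-instruct"}]}
print(json.dumps(point_models_at_codegate(config), indent=2))
```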

For autocomplete, add your model config to the `tabAutocompleteModel` section
of the config.json file. Example for Anthropic:

```json
"tabAutocompleteModel": {
  "title": "CodeGate-Anthropic",
  "provider": "anthropic",
  "model": "claude-3-5-sonnet-latest",
  "apiKey": "YOUR_API_KEY",
  "apiBase": "http://localhost:8989/anthropic"
},
```

For more details, refer to the full
[CodeGate how-to guide for Continue](https://docs.codegate.ai/how-to/use-with-continue#configure-continue-to-use-codegate).
## Verify configuration

To verify that you've successfully connected Continue to CodeGate, open the
Continue chat and type "codegate-version". You should receive a response like
"CodeGate version 0.1.0".

You can now start using Continue as before, but with the added benefit of extra
privacy and control over your data.

![Continue chat](./images/continue-chat.png)

## Next steps

Explore the full [CodeGate docs](https://docs.codegate.ai), join the
[community Discord server](https://discord.gg/stacklok) to chat about the
project, and get involved on the
[GitHub repo](https://github.com/stacklok/codegate)!

## Support

If you need help, please ask for support on the Continue section of
[CodeGate discussions](https://github.com/stacklok/codegate/discussions/categories/continue)
or in the #codegate channel on [Discord](https://discord.gg/stacklok).
