@@ -6,8 +6,8 @@ import TabItem from '@theme/TabItem';
You need an [OpenAI API](https://openai.com/api/) account to use this provider.

- Before you run Aider, set environment variables for your API key and to set the
- API base URL to CodeGate's API port. Alternately, use one of Aider's other
+ Before you run aider, set environment variables for your API key and set the
+ API base URL to CodeGate's API port. Alternately, use one of aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html)
to set the corresponding values.
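For example, a minimal shell setup might look like the sketch below; aider reads `OPENAI_API_KEY` and `OPENAI_API_BASE`, and the `/openai` route on CodeGate's default port 8989 is an assumption mirroring the Ollama URL shown later on this page:

```bash
# Replace <YOUR_API_KEY> with your OpenAI API key
export OPENAI_API_KEY=<YOUR_API_KEY>

# Assumption: CodeGate exposes its OpenAI-compatible endpoint at /openai on port 8989
export OPENAI_API_BASE=http://localhost:8989/openai
```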
@@ -47,7 +47,7 @@ Replace `<YOUR_API_KEY>` with your
[OpenAI API key](https://platform.openai.com/api-keys).

Then run `aider` as normal. For more information, see the
- [Aider docs for connecting to OpenAI](https://aider.chat/docs/llms/openai.html).
+ [aider docs for connecting to OpenAI](https://aider.chat/docs/llms/openai.html).
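With those variables set, starting aider needs no extra flags; a specific model can still be requested with `--model` (the model name below is purely illustrative):

```bash
# Uses the key and base URL exported above, so requests flow through CodeGate
aider

# Optionally pick a model explicitly (illustrative name)
aider --model gpt-4o
```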
</TabItem>
<TabItem value="ollama" label="Ollama">
@@ -60,8 +60,8 @@ changed the default Ollama server port or to connect to a remote Ollama
instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable
set to the correct URL. See [Configure CodeGate](/how-to/configure.md).
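As a rough sketch only (the container image, port mapping, and host URL below are assumptions; the configuration guide is authoritative), the variable can be passed when launching the container:

```bash
# Assumptions: CodeGate runs as a Docker container and Ollama listens on the
# host's default port 11434; adjust the URL for a remote or non-default setup.
docker run -d -p 8989:8989 \
  -e CODEGATE_OLLAMA_URL=http://host.docker.internal:11434 \
  ghcr.io/stacklok/codegate:latest
```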
- Before you run Aider, set the Ollama base URL to CodeGate's API port using an
- environment variable. Alternately, use one of Aider's other
+ Before you run aider, set the Ollama base URL to CodeGate's API port using an
+ environment variable. Alternately, use one of aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html)
to set the corresponding values.
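On macOS or Linux this takes the form of the export used later on this page:

```bash
# Point aider's Ollama integration at CodeGate's API port
export OLLAMA_API_BASE=http://localhost:8989/ollama
```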
@@ -75,7 +75,7 @@ export OLLAMA_API_BASE=http://localhost:8989/ollama
:::note

To persist this setting, add it to your shell profile (e.g., `~/.bashrc` or
- `~/.zshrc`) or use one of Aider's other
+ `~/.zshrc`) or use one of aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html).
:::
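For example, to persist the Ollama base URL for zsh (a sketch; use the profile file for your shell):

```bash
# Append the export so new shell sessions pick it up automatically
echo 'export OLLAMA_API_BASE=http://localhost:8989/ollama' >> ~/.zshrc
```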
@@ -96,7 +96,7 @@ Restart your shell after running `setx`.
</TabItem>
</Tabs>

- Then run Aider:
+ Then run aider:
```bash
aider --model ollama_chat/<MODEL_NAME>
@@ -114,7 +114,7 @@ CPU cores and 16GB of RAM. If you have more compute resources available, our
experimentation shows that larger models do yield better results.

For more information, see the
- [Aider docs for connecting to Ollama](https://aider.chat/docs/llms/ollama.html).
+ [aider docs for connecting to Ollama](https://aider.chat/docs/llms/ollama.html).
</TabItem>
</Tabs>