> **Note:** Remove versions older than 0.1.x before installing.
### Providers
The recommended approach is to sign up for Claude Pro or Max, run `opencode auth login`, and select Anthropic. It's the most cost-effective way to use opencode.

opencode is powered by the provider list at [Models.dev](https://models.dev), so you can use `opencode auth login` to configure API keys for any provider you'd like to use. These are stored in `~/.local/share/opencode/auth.json`.
```bash
$ opencode auth login
```
The Models.dev dataset is also used to detect common environment variables like `OPENAI_API_KEY` and autoload the matching provider.
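
For example, if a key is already exported in your shell, there's no separate login step for that provider. A minimal sketch, assuming `OPENAI_API_KEY` is picked up as described and that plain `opencode` launches the TUI:

```bash
# The key is read from the environment, so the OpenAI provider is autoloaded
export OPENAI_API_KEY="sk-..."
opencode
```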
If there are additional providers you want to use, you can submit a PR to the [Models.dev repo](https://github.com/sst/models.dev). If you're configuring a provider just for yourself, check out the Config section below.
### Project Config
Project configuration is optional. You can place an `opencode.json` file in the root of your repo, and it'll be loaded.
```json title="opencode.json"
{
  "$schema": "https://opencode.ai/config.json"
}
```
#### Providers
You can use opencode with any provider listed [here](https://ai-sdk.dev/providers/ai-sdk-providers). Be sure to specify the npm package used to load the provider.
```json title="opencode.json"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      }
    }
  }
}
```
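This example points at a local Ollama instance, which serves an OpenAI-compatible API at that `baseURL`. A minimal sketch for getting that endpoint up, assuming Ollama is installed (the model name is only an example; use whatever you've pulled):

```bash
# Ollama exposes an OpenAI-compatible API at http://localhost:11434/v1
ollama pull llama3.2   # example model; any pulled model works
ollama serve           # start the local server if it isn't already running
```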
### Contributing
To run opencode locally you need:

- Bun
- Golang 1.24.x
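
If you don't have these yet, a rough install sketch (the Bun one-liner is its official install script; Homebrew is shown as just one way to get Go):

```bash
# Install Bun (official install script)
curl -fsSL https://bun.sh/install | bash

# Install Go 1.24.x (use your platform's package manager or https://go.dev/dl)
brew install go
```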
To run:
```bash
$ bun install
$ cd packages/opencode
$ bun run src/index.ts
```
### FAQ
#### How do I use this with OpenRouter?
OpenRouter is not in the Models.dev database yet, but you can configure it manually.
```json title="opencode.json"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openrouter": {
      "npm": "@openrouter/ai-sdk-provider",
      "name": "OpenRouter",
      "options": {
        "apiKey": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
      }
    }
  }
}
```
#### How is this different from Claude Code?
It's very similar to Claude Code in terms of capability. Here are the key differences:

- 100% open source
- Not coupled to any provider. Although Anthropic is recommended, opencode can be used with OpenAI, Google, or even local models. As models evolve, the gaps between them will close and pricing will drop, so being provider agnostic is important.
- A focus on TUI. opencode is built by neovim users and the creators of [terminal.shop](https://terminal.shop); we are going to push the limits of what's possible in the terminal.
- A client/server architecture. opencode can, for example, run on your computer while you drive it remotely from a mobile app; the TUI frontend is just one of many possible clients.
#### What about Windows support?
There are some minor problems blocking opencode from working on Windows. We are working on them now; you'll need to use WSL in the meantime.
#### What's the other repo?
The other confusingly named repo has no relation to this one. You can [read the story behind it here](https://x.com/thdxr/status/1933561254481666466).

---