
Commit b167675

package updates (#871)
2 parents f0a5c12 + d560352 commit b167675

9 files changed: +1857 -1655 lines

.env.example

+3

@@ -1,3 +1,6 @@
+# Get your xAI API Key here for chat models: https://console.x.ai/
+XAI_API_KEY=****
+
 # Get your OpenAI API Key here for chat models: https://platform.openai.com/account/api-keys
 OPENAI_API_KEY=****

README.md

+4 -4

@@ -23,7 +23,7 @@
 - [AI SDK](https://sdk.vercel.ai/docs)
   - Unified API for generating text, structured objects, and tool calls with LLMs
   - Hooks for building dynamic chat and generative user interfaces
-  - Supports OpenAI (default), Anthropic, Cohere, and other model providers
+  - Supports xAI (default), OpenAI, Fireworks, and other model providers
 - [shadcn/ui](https://ui.shadcn.com)
   - Styling with [Tailwind CSS](https://tailwindcss.com)
   - Component primitives from [Radix UI](https://radix-ui.com) for accessibility and flexibility
@@ -35,19 +35,19 @@

 ## Model Providers

-This template ships with OpenAI `gpt-4o` as the default. However, with the [AI SDK](https://sdk.vercel.ai/docs), you can switch LLM providers to [OpenAI](https://openai.com), [Anthropic](https://anthropic.com), [Cohere](https://cohere.com/), and [many more](https://sdk.vercel.ai/providers/ai-sdk-providers) with just a few lines of code.
+This template ships with [xAI](https://x.ai) `grok-2-1212` as the default chat model. However, with the [AI SDK](https://sdk.vercel.ai/docs), you can switch LLM providers to [OpenAI](https://openai.com), [Anthropic](https://anthropic.com), [Cohere](https://cohere.com/), and [many more](https://sdk.vercel.ai/providers/ai-sdk-providers) with just a few lines of code.

 ## Deploy Your Own

 You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:

-[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=[{%22type%22:%22postgres%22},{%22type%22:%22blob%22}])
+[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY,XAI_API_KEY,FIREWORKS_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=[{%22type%22:%22postgres%22},{%22type%22:%22blob%22}])

 ## Running locally

 You will need to use the environment variables [defined in `.env.example`](.env.example) to run Next.js AI Chatbot. It's recommended you use [Vercel Environment Variables](https://vercel.com/docs/projects/environment-variables) for this, but a `.env` file is all that is necessary.

-> Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various OpenAI and authentication provider accounts.
+> Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various AI and authentication provider accounts.

 1. Install Vercel CLI: `npm i -g vercel`
 2. Link local instance with Vercel and GitHub accounts (creates `.vercel` directory): `vercel link`
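
For orientation, here is a minimal sketch of calling the new default model through the AI SDK (assuming `XAI_API_KEY` is set; the model id mirrors the template default described above):

```ts
import { generateText } from 'ai';
import { xai } from '@ai-sdk/xai';

// Ask the default chat model a question and print the reply.
const { text } = await generateText({
  model: xai('grok-2-1212'),
  prompt: 'Why is the sky blue?',
});

console.log(text);
```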

components/chat-header.tsx

+1 -1

@@ -72,7 +72,7 @@ function PureChatHeader({
           asChild
         >
           <Link
-            href="https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=%5B%7B%22type%22:%22postgres%22%7D,%7B%22type%22:%22blob%22%7D%5D"
+            href="https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY,XAI_API_KEY,FIREWORKS_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=%5B%7B%22type%22:%22postgres%22%7D,%7B%22type%22:%22blob%22%7D%5D"
             target="_noblank"
           >
             <VercelIcon size={16} />

docs/01-quick-start.md

+5 -2

@@ -8,11 +8,14 @@ Deploying to [Vercel](https://vercel.com) is the quickest way to get started wit

 - Vercel account and [Vercel CLI](https://vercel.com/docs/cli)
 - GitHub/GitLab/Bitbucket account
-- API Key from [OpenAI](https://platform.openai.com)
+- API Keys from three AI model providers:
+  - [xAI](https://console.x.ai/)
+  - [OpenAI](https://platform.openai.com/account/api-keys)
+  - [Fireworks](https://fireworks.ai/account/api-keys)

 ### Deploy to Vercel

-To deploy the chatbot template to Vercel, click this [link](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=%5B%7B%22type%22:%22postgres%22%7D,%7B%22type%22:%22blob%22%7D%5D) to enter the 1-click deploy flow.
+To deploy the chatbot template to Vercel, click this [link](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY,XAI_API_KEY,FIREWORKS_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=%5B%7B%22type%22:%22postgres%22%7D,%7B%22type%22:%22blob%22%7D%5D) to enter the 1-click deploy flow.

 During the flow, you will be prompted to create and connect to a postgres database and blob store. You will also need to provide environment variables for the application.
docs/02-update-models.md

+10 -11

@@ -1,42 +1,41 @@
 # Update Models

-The chatbot template ships with [OpenAI](https://sdk.vercel.ai/providers/ai-sdk-providers/openai) as the default model provider. Since the template is powered by the [AI SDK](https://sdk.vercel.ai), which supports [multiple providers](https://sdk.vercel.ai/providers/ai-sdk-providers) out of the box, you can easily switch to another provider of your choice.
+The chatbot template ships with [xAI](https://sdk.vercel.ai/providers/ai-sdk-providers/xai) as the default model provider. Since the template is powered by the [AI SDK](https://sdk.vercel.ai), which supports [multiple providers](https://sdk.vercel.ai/providers/ai-sdk-providers) out of the box, you can easily switch to another provider of your choice.

 To update the models, you will need to update the custom provider called `myProvider` at `/lib/ai/models.ts` shown below.

 ```ts
 import { customProvider } from "ai";
-import { openai } from "@ai-sdk/openai";
+import { xai } from "@ai-sdk/xai";

 export const myProvider = customProvider({
   languageModels: {
-    "chat-model-small": openai("gpt-4o-mini"),
-    "chat-model-large": openai("gpt-4o"),
+    "chat-model": xai("grok-2-1212"),
     "chat-model-reasoning": wrapLanguageModel({
       model: fireworks("accounts/fireworks/models/deepseek-r1"),
       middleware: extractReasoningMiddleware({ tagName: "think" }),
     }),
-    "title-model": openai("gpt-4-turbo"),
-    "artifact-model": openai("gpt-4o-mini"),
+    "title-model": xai("grok-2-1212"),
+    "artifact-model": xai("grok-2-1212"),
   },
   imageModels: {
-    "small-model": openai.image("dall-e-3"),
+    "small-model": openai.image("dall-e-2"),
+    "large-model": openai.image("dall-e-3"),
   },
 });
 ```

-You can replace the `openai` models with any other provider of your choice. You will need to install the provider library and switch the models accordingly.
+You can replace the models with any other provider of your choice. You will need to install the provider library and switch the models accordingly.

-For example, if you want to use Anthropic's `claude-3-5-sonnet` model for `chat-model-large`, you can replace the `openai` model with the `anthropic` model as shown below.
+For example, if you want to use Anthropic's `claude-3-5-sonnet` model for `chat-model`, you can replace the `xai` model with the `anthropic` model as shown below.

 ```ts
 import { customProvider } from "ai";
 import { anthropic } from "@ai-sdk/anthropic";

 export const myProvider = customProvider({
   languageModels: {
-    "chat-model-small": openai("gpt-4o-mini"),
-    "chat-model-large": anthropic("claude-3-5-sonnet"), // Replace openai with anthropic
+    "chat-model": anthropic("claude-3-5-sonnet"), // Replace xai with anthropic
     "chat-model-reasoning": wrapLanguageModel({
       model: fireworks("accounts/fireworks/models/deepseek-r1"),
       middleware: extractReasoningMiddleware({ tagName: "think" }),
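
For reference, a self-contained version of that Anthropic swap, with the imports the doc's snippet relies on spelled out. This is a sketch under the assumptions that `wrapLanguageModel` and `extractReasoningMiddleware` come from the `ai` package and that the remaining entries stay on xAI/Fireworks; it keeps the doc's model ids as written.

```ts
import {
  customProvider,
  extractReasoningMiddleware,
  wrapLanguageModel,
} from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { fireworks } from "@ai-sdk/fireworks";
import { xai } from "@ai-sdk/xai";

export const myProvider = customProvider({
  languageModels: {
    // Swapped in place of xai("grok-2-1212"), as in the doc's example.
    "chat-model": anthropic("claude-3-5-sonnet"),
    "chat-model-reasoning": wrapLanguageModel({
      model: fireworks("accounts/fireworks/models/deepseek-r1"),
      middleware: extractReasoningMiddleware({ tagName: "think" }),
    }),
    "title-model": xai("grok-2-1212"),
    "artifact-model": xai("grok-2-1212"),
  },
});
```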

lib/ai/models.ts

+4 -9

@@ -1,4 +1,4 @@
-export const DEFAULT_CHAT_MODEL: string = 'chat-model-small';
+export const DEFAULT_CHAT_MODEL: string = 'chat-model';

 interface ChatModel {
   id: string;
@@ -8,14 +8,9 @@ interface ChatModel {

 export const chatModels: Array<ChatModel> = [
   {
-    id: 'chat-model-small',
-    name: 'Small model',
-    description: 'Small model for fast, lightweight tasks',
-  },
-  {
-    id: 'chat-model-large',
-    name: 'Large model',
-    description: 'Large model for complex, multi-step tasks',
+    id: 'chat-model',
+    name: 'Chat model',
+    description: 'Primary model for all-purpose chat',
   },
   {
     id: 'chat-model-reasoning',
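
A small usage sketch of the slimmed-down registry follows. The helper and the `@/lib/ai/models` import path are hypothetical, shown only to illustrate how `DEFAULT_CHAT_MODEL` can act as a fallback once the old `chat-model-small` / `chat-model-large` ids are gone.

```ts
import { chatModels, DEFAULT_CHAT_MODEL } from '@/lib/ai/models';

// Hypothetical helper: accept a model id (e.g. from a cookie or query string)
// and fall back to the default when the id is no longer registered.
export function resolveChatModelId(requested?: string): string {
  const match = chatModels.find((model) => model.id === requested);
  return match ? match.id : DEFAULT_CHAT_MODEL;
}
```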

lib/ai/providers.ts

+5 -6

@@ -5,6 +5,7 @@ import {
 } from 'ai';
 import { openai } from '@ai-sdk/openai';
 import { fireworks } from '@ai-sdk/fireworks';
+import { xai } from '@ai-sdk/xai';
 import { isTestEnvironment } from '../constants';
 import {
   artifactModel,
@@ -16,23 +17,21 @@ import {
 export const myProvider = isTestEnvironment
   ? customProvider({
       languageModels: {
-        'chat-model-small': chatModel,
-        'chat-model-large': chatModel,
+        'chat-model': chatModel,
         'chat-model-reasoning': reasoningModel,
         'title-model': titleModel,
         'artifact-model': artifactModel,
       },
     })
   : customProvider({
       languageModels: {
-        'chat-model-small': openai('gpt-4o-mini'),
-        'chat-model-large': openai('gpt-4o'),
+        'chat-model': xai('grok-2-1212'),
        'chat-model-reasoning': wrapLanguageModel({
          model: fireworks('accounts/fireworks/models/deepseek-r1'),
          middleware: extractReasoningMiddleware({ tagName: 'think' }),
        }),
-        'title-model': openai('gpt-4-turbo'),
-        'artifact-model': openai('gpt-4o-mini'),
+        'title-model': xai('grok-2-1212'),
+        'artifact-model': xai('grok-2-1212'),
       },
       imageModels: {
         'small-model': openai.image('dall-e-2'),
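
A brief usage sketch, assuming `myProvider` is exported from `lib/ai/providers.ts` as in the diff above: registered ids are resolved through `languageModel()`. The route handler shape here is illustrative, not the template's actual handler.

```ts
import { streamText } from 'ai';
import { myProvider } from '@/lib/ai/providers';

// Illustrative Next.js route handler: stream a completion from the
// provider entry registered as 'chat-model' (grok-2-1212 in production,
// the mock chatModel in the test environment).
export async function POST(request: Request) {
  const { prompt } = await request.json();

  const result = streamText({
    model: myProvider.languageModel('chat-model'),
    prompt,
  });

  return result.toDataStreamResponse();
}
```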

package.json

+3 -2

@@ -19,9 +19,10 @@
     "test": "export PLAYWRIGHT=True && pnpm exec playwright test --workers=4"
   },
   "dependencies": {
-    "@ai-sdk/fireworks": "0.1.16",
-    "@ai-sdk/openai": "1.2.5",
+    "@ai-sdk/fireworks": "^0.1.16",
+    "@ai-sdk/openai": "^1.2.5",
     "@ai-sdk/react": "^1.1.23",
+    "@ai-sdk/xai": "^1.1.15",
     "@codemirror/lang-javascript": "^6.2.2",
     "@codemirror/lang-python": "^6.1.6",
     "@codemirror/state": "^6.5.0",
