package updates #871

Merged
merged 6 commits into from
Mar 18, 2025
3 changes: 3 additions & 0 deletions .env.example
@@ -1,3 +1,6 @@
# Get your xAI API Key here for chat models: https://console.x.ai/
XAI_API_KEY=****

# Get your OpenAI API Key here for chat models: https://platform.openai.com/account/api-keys
OPENAI_API_KEY=****

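Note: the stock `xai`, `openai`, and `fireworks` provider instances in the AI SDK read these keys from the environment by default (`XAI_API_KEY`, `OPENAI_API_KEY`, `FIREWORKS_API_KEY`). If a key needs to be supplied explicitly instead, a minimal sketch (not part of this diff; it assumes `createXai` from `@ai-sdk/xai`, mirroring `createOpenAI`) could look like:

```ts
// Sketch: build an xAI provider instance with an explicit API key instead of
// relying on the implicit XAI_API_KEY environment variable lookup.
import { createXai } from '@ai-sdk/xai';

const xaiWithKey = createXai({
  apiKey: process.env.XAI_API_KEY, // assumption: key is still sourced from env here
});

// Any Grok model can then be created from this instance.
export const chatModel = xaiWithKey('grok-2-1212');
```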
8 changes: 4 additions & 4 deletions README.md
@@ -23,7 +23,7 @@
- [AI SDK](https://sdk.vercel.ai/docs)
- Unified API for generating text, structured objects, and tool calls with LLMs
- Hooks for building dynamic chat and generative user interfaces
- Supports OpenAI (default), Anthropic, Cohere, and other model providers
- Supports xAI (default), OpenAI, Fireworks, and other model providers
- [shadcn/ui](https://ui.shadcn.com)
- Styling with [Tailwind CSS](https://tailwindcss.com)
- Component primitives from [Radix UI](https://radix-ui.com) for accessibility and flexibility
@@ -35,19 +35,19 @@

## Model Providers

This template ships with OpenAI `gpt-4o` as the default. However, with the [AI SDK](https://sdk.vercel.ai/docs), you can switch LLM providers to [OpenAI](https://openai.com), [Anthropic](https://anthropic.com), [Cohere](https://cohere.com/), and [many more](https://sdk.vercel.ai/providers/ai-sdk-providers) with just a few lines of code.
This template ships with [xAI](https://x.ai) `grok-2-1212` as the default chat model. However, with the [AI SDK](https://sdk.vercel.ai/docs), you can switch LLM providers to [OpenAI](https://openai.com), [Anthropic](https://anthropic.com), [Cohere](https://cohere.com/), and [many more](https://sdk.vercel.ai/providers/ai-sdk-providers) with just a few lines of code.

## Deploy Your Own

You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=[{%22type%22:%22postgres%22},{%22type%22:%22blob%22}])
[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY,XAI_API_KEY,FIREWORKS_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=[{%22type%22:%22postgres%22},{%22type%22:%22blob%22}])

## Running locally

You will need to use the environment variables [defined in `.env.example`](.env.example) to run Next.js AI Chatbot. It's recommended you use [Vercel Environment Variables](https://vercel.com/docs/projects/environment-variables) for this, but a `.env` file is all that is necessary.

> Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various OpenAI and authentication provider accounts.
> Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various AI and authentication provider accounts.

1. Install Vercel CLI: `npm i -g vercel`
2. Link local instance with Vercel and GitHub accounts (creates `.vercel` directory): `vercel link`
2 changes: 1 addition & 1 deletion components/chat-header.tsx
@@ -72,7 +72,7 @@ function PureChatHeader({
asChild
>
<Link
href="https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=%5B%7B%22type%22:%22postgres%22%7D,%7B%22type%22:%22blob%22%7D%5D"
href="https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY,XAI_API_KEY,FIREWORKS_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=%5B%7B%22type%22:%22postgres%22%7D,%7B%22type%22:%22blob%22%7D%5D"
target="_noblank"
>
<VercelIcon size={16} />
7 changes: 5 additions & 2 deletions docs/01-quick-start.md
@@ -8,11 +8,14 @@ Deploying to [Vercel](https://vercel.com) is the quickest way to get started wit

- Vercel account and [Vercel CLI](https://vercel.com/docs/cli)
- GitHub/GitLab/Bitbucket account
- API Key from [OpenAI](https://platform.openai.com)
- API Keys from three AI model providers:
- [xAI](https://console.x.ai/)
- [OpenAI](https://platform.openai.com/account/api-keys)
- [Fireworks](https://fireworks.ai/account/api-keys)

### Deploy to Vercel

To deploy the chatbot template to Vercel, click this [link](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=%5B%7B%22type%22:%22postgres%22%7D,%7B%22type%22:%22blob%22%7D%5D) to enter the 1-click deploy flow.
To deploy the chatbot template to Vercel, click this [link](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot&env=AUTH_SECRET,OPENAI_API_KEY,XAI_API_KEY,FIREWORKS_API_KEY&envDescription=Learn%20more%20about%20how%20to%20get%20the%20API%20Keys%20for%20the%20application&envLink=https%3A%2F%2Fgithub.com%2Fvercel%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&demo-title=AI%20Chatbot&demo-description=An%20Open-Source%20AI%20Chatbot%20Template%20Built%20With%20Next.js%20and%20the%20AI%20SDK%20by%20Vercel.&demo-url=https%3A%2F%2Fchat.vercel.ai&stores=%5B%7B%22type%22:%22postgres%22%7D,%7B%22type%22:%22blob%22%7D%5D) to enter the 1-click deploy flow.

During the flow, you will be prompted to create and connect to a postgres database and blob store. You will also need to provide environment variables for the application.

21 changes: 10 additions & 11 deletions docs/02-update-models.md
@@ -1,42 +1,41 @@
# Update Models

The chatbot template ships with [OpenAI](https://sdk.vercel.ai/providers/ai-sdk-providers/openai) as the default model provider. Since the template is powered by the [AI SDK](https://sdk.vercel.ai), which supports [multiple providers](https://sdk.vercel.ai/providers/ai-sdk-providers) out of the box, you can easily switch to another provider of your choice.
The chatbot template ships with [xAI](https://sdk.vercel.ai/providers/ai-sdk-providers/xai) as the default model provider. Since the template is powered by the [AI SDK](https://sdk.vercel.ai), which supports [multiple providers](https://sdk.vercel.ai/providers/ai-sdk-providers) out of the box, you can easily switch to another provider of your choice.

To update the models, you will need to update the custom provider called `myProvider` at `/lib/ai/models.ts` shown below.

```ts
import { customProvider } from "ai";
import { openai } from "@ai-sdk/openai";
import { xai } from "@ai-sdk/xai";

export const myProvider = customProvider({
languageModels: {
"chat-model-small": openai("gpt-4o-mini"),
"chat-model-large": openai("gpt-4o"),
"chat-model": xai("grok-2-1212"),
"chat-model-reasoning": wrapLanguageModel({
model: fireworks("accounts/fireworks/models/deepseek-r1"),
middleware: extractReasoningMiddleware({ tagName: "think" }),
}),
"title-model": openai("gpt-4-turbo"),
"artifact-model": openai("gpt-4o-mini"),
"title-model": xai("grok-2-1212"),
"artifact-model": xai("grok-2-1212"),
},
imageModels: {
"small-model": openai.image("dall-e-3"),
"small-model": openai.image("dall-e-2"),
"large-model": openai.image("dall-e-3"),
},
});
```

You can replace the `openai` models with any other provider of your choice. You will need to install the provider library and switch the models accordingly.
You can replace the models with any other provider of your choice. You will need to install the provider library and switch the models accordingly.

For example, if you want to use Anthropic's `claude-3-5-sonnet` model for `chat-model-large`, you can replace the `openai` model with the `anthropic` model as shown below.
For example, if you want to use Anthropic's `claude-3-5-sonnet` model for `chat-model`, you can replace the `xai` model with the `anthropic` model as shown below.

```ts
import { customProvider } from "ai";
import { anthropic } from "@ai-sdk/anthropic";

export const myProvider = customProvider({
languageModels: {
"chat-model-small": openai("gpt-4o-mini"),
"chat-model-large": anthropic("claude-3-5-sonnet"), // Replace openai with anthropic
"chat-model": anthropic("claude-3-5-sonnet"), // Replace xai with anthropic
"chat-model-reasoning": wrapLanguageModel({
model: fireworks("accounts/fireworks/models/deepseek-r1"),
middleware: extractReasoningMiddleware({ tagName: "think" }),
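The doc snippet above is cut off in this view, and as diffed it uses `fireworks`, `wrapLanguageModel`, and `extractReasoningMiddleware` without importing them. A self-contained sketch of the Anthropic swap described there (not part of this PR; it assumes `@ai-sdk/anthropic`, `@ai-sdk/fireworks`, and `@ai-sdk/openai` are installed, and the exact Anthropic model id string may differ):

```ts
import {
  customProvider,
  extractReasoningMiddleware,
  wrapLanguageModel,
} from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { fireworks } from '@ai-sdk/fireworks';
import { openai } from '@ai-sdk/openai';

export const myProvider = customProvider({
  languageModels: {
    // Anthropic replaces xAI for the primary chat model.
    'chat-model': anthropic('claude-3-5-sonnet'),
    // Reasoning traces are extracted from <think> tags emitted by DeepSeek R1.
    'chat-model-reasoning': wrapLanguageModel({
      model: fireworks('accounts/fireworks/models/deepseek-r1'),
      middleware: extractReasoningMiddleware({ tagName: 'think' }),
    }),
    'title-model': anthropic('claude-3-5-sonnet'),
    'artifact-model': anthropic('claude-3-5-sonnet'),
  },
  imageModels: {
    'small-model': openai.image('dall-e-2'),
    'large-model': openai.image('dall-e-3'),
  },
});
```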
13 changes: 4 additions & 9 deletions lib/ai/models.ts
@@ -1,4 +1,4 @@
export const DEFAULT_CHAT_MODEL: string = 'chat-model-small';
export const DEFAULT_CHAT_MODEL: string = 'chat-model';

interface ChatModel {
id: string;
@@ -8,14 +8,9 @@ interface ChatModel {

export const chatModels: Array<ChatModel> = [
{
id: 'chat-model-small',
name: 'Small model',
description: 'Small model for fast, lightweight tasks',
},
{
id: 'chat-model-large',
name: 'Large model',
description: 'Large model for complex, multi-step tasks',
id: 'chat-model',
name: 'Chat model',
description: 'Primary model for all-purpose chat',
},
{
id: 'chat-model-reasoning',
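For context, a small illustrative sketch (not from this PR; the `@/lib/ai/models` import path is assumed) of how callers might resolve a requested model id against the narrowed `chatModels` list and `DEFAULT_CHAT_MODEL`:

```ts
import { chatModels, DEFAULT_CHAT_MODEL } from '@/lib/ai/models';

// Resolve a model id (e.g. from a cookie) and fall back to the default chat model.
function resolveChatModel(requestedId?: string) {
  return (
    chatModels.find((model) => model.id === requestedId) ??
    chatModels.find((model) => model.id === DEFAULT_CHAT_MODEL)
  );
}
```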
11 changes: 5 additions & 6 deletions lib/ai/providers.ts
@@ -5,6 +5,7 @@ import {
} from 'ai';
import { openai } from '@ai-sdk/openai';
import { fireworks } from '@ai-sdk/fireworks';
import { xai } from '@ai-sdk/xai';
import { isTestEnvironment } from '../constants';
import {
artifactModel,
@@ -16,23 +17,21 @@ import {
export const myProvider = isTestEnvironment
? customProvider({
languageModels: {
'chat-model-small': chatModel,
'chat-model-large': chatModel,
'chat-model': chatModel,
'chat-model-reasoning': reasoningModel,
'title-model': titleModel,
'artifact-model': artifactModel,
},
})
: customProvider({
languageModels: {
'chat-model-small': openai('gpt-4o-mini'),
'chat-model-large': openai('gpt-4o'),
'chat-model': xai('grok-2-1212'),
'chat-model-reasoning': wrapLanguageModel({
model: fireworks('accounts/fireworks/models/deepseek-r1'),
middleware: extractReasoningMiddleware({ tagName: 'think' }),
}),
'title-model': openai('gpt-4-turbo'),
'artifact-model': openai('gpt-4o-mini'),
'title-model': xai('grok-2-1212'),
'artifact-model': xai('grok-2-1212'),
},
imageModels: {
'small-model': openai.image('dall-e-2'),
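Callers are insulated from this rename: the rest of the app keeps asking for logical ids like `chat-model`, and the provider map decides which vendor serves them. A minimal usage sketch (the `@/lib/ai/providers` import path is assumed for illustration):

```ts
import { streamText } from 'ai';
import { myProvider } from '@/lib/ai/providers'; // path assumed for illustration

// 'chat-model' now resolves to xai('grok-2-1212') in production
// and to the mock chatModel in the test environment.
const result = streamText({
  model: myProvider.languageModel('chat-model'),
  prompt: 'Say hello in one sentence.',
});
```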
5 changes: 3 additions & 2 deletions package.json
@@ -19,9 +19,10 @@
"test": "export PLAYWRIGHT=True && pnpm exec playwright test --workers=4"
},
"dependencies": {
"@ai-sdk/fireworks": "0.1.16",
"@ai-sdk/openai": "1.2.5",
"@ai-sdk/fireworks": "^0.1.16",
"@ai-sdk/openai": "^1.2.5",
"@ai-sdk/react": "^1.1.23",
"@ai-sdk/xai": "^1.1.15",
"@codemirror/lang-javascript": "^6.2.2",
"@codemirror/lang-python": "^6.1.6",
"@codemirror/state": "^6.5.0",
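Note on the version specifiers: switching `@ai-sdk/fireworks` and `@ai-sdk/openai` from pinned versions to caret ranges means npm/pnpm may install newer compatible releases — `^0.1.16` allows any `0.1.x` release at or above `0.1.16`, and `^1.2.5` allows any `1.x` release at or above `1.2.5`.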