Merge pull request #74 from arshad-yaseen/add-claude-3.5-haiku
Add claude-3-5-haiku
arshad-yaseen authored Nov 4, 2024
2 parents c9b4e0d + 9189196 commit 196ae84
Showing 17 changed files with 68 additions and 70 deletions.
6 changes: 3 additions & 3 deletions CONTRIBUTING.md
@@ -136,13 +136,13 @@ To test Monacopilot locally, follow these steps:

1. **Set Up Environment Variables**

-   You need to set the [Groq API key](https://console.groq.com/keys) as an environment variable. Create a `.env.local` file in the `tests/ui` directory with the following content:
+   You need to set the [OpenAI API key](https://platform.openai.com/api-keys) as an environment variable. Create a `.env.local` file in the `tests/ui` directory with the following content:

```plaintext
-   GROQ_API_KEY=your_api_key
+   OPENAI_API_KEY=your_api_key
```

-   Replace `your_api_key` with your actual Groq API key.
+   Replace `your_api_key` with your actual OpenAI API key.

2. **Run the Test UI**

35 changes: 16 additions & 19 deletions README.md
@@ -72,9 +72,9 @@ import {Copilot} from 'monacopilot';

const app = express();
const port = process.env.PORT || 3000;
-const copilot = new Copilot(process.env.GROQ_API_KEY!, {
-  provider: 'groq',
-  model: 'llama-3-70b',
+const copilot = new Copilot(process.env.ANTHROPIC_API_KEY!, {
+  provider: 'anthropic',
+  model: 'claude-3-5-haiku',
});

app.use(express.json());
@@ -87,7 +87,7 @@ app.post('/complete', async (req, res) => {
// Process raw LLM response if needed
// `raw` can be undefined if an error occurred, which happens when `error` is present
if (raw) {
-    calculateCost(raw.usage.total_tokens);
+    calculateCost(raw.usage.input_tokens);
}

// Handle errors if present
@@ -145,9 +145,6 @@ registerCompletion(monaco, editor, {
endpoint: 'https://api.example.com/complete',
// The language of the editor.
language: 'javascript',
-  // If you are using Groq as your provider, it's recommended to set `maxContextLines` to `60` or less.
-  // This is because Groq has low rate limits and doesn't offer pay-as-you-go pricing.
-  maxContextLines: 60,
});
```

@@ -168,11 +165,11 @@ registerCompletion(monaco, editor, {
});
```

-| Trigger | Description | Notes |
-| --- | --- | --- |
-| `'onIdle'` (default) | Provides completions after a brief pause in typing. | This approach is less resource-intensive, as it only initiates a request when the editor is idle. |
-| `'onTyping'` | Provides completions in real-time as you type. | Best suited for models with low response latency, such as Groq models. This trigger mode initiates additional background requests to deliver real-time suggestions, a method known as predictive caching. |
-| `'onDemand'` | Does not provide completions automatically. | Completions are triggered manually using the `trigger` function from the `registerCompletion` return. This allows for precise control over when completions are provided. |
+| Trigger | Description | Notes |
+| --- | --- | --- |
+| `'onIdle'` (default) | Provides completions after a brief pause in typing. | This approach is less resource-intensive, as it only initiates a request when the editor is idle. |
+| `'onTyping'` | Provides completions in real-time as you type. | Best suited for models with low response latency, such as Groq models or Claude 3-5 Haiku. This trigger mode initiates additional background requests to deliver real-time suggestions, a method known as predictive caching. |
+| `'onDemand'` | Does not provide completions automatically. | Completions are triggered manually using the `trigger` function from the `registerCompletion` return. This allows for precise control over when completions are provided. |

[OnTyping Demo](https://github.com/user-attachments/assets/22c2ce44-334c-4963-b853-01b890b8e39f)
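The `'onIdle'` mode described in the trigger table is essentially an idle-timeout (debounce) around completion requests. Below is a minimal standalone sketch of that concept only; it is not Monacopilot's actual implementation, and every name in it is illustrative. The clock is injected so the behavior is deterministic:

```typescript
// Fires the completion callback once per quiet period, resetting on
// every keystroke. Illustrative sketch, not library code.
type Clock = () => number;

class IdleTrigger {
  private lastKeystroke = -Infinity;
  private fired = false;

  constructor(
    private idleMs: number,
    private onIdle: () => void,
    private now: Clock,
  ) {}

  // Call on every keystroke: resets the idle timer.
  keystroke(): void {
    this.lastKeystroke = this.now();
    this.fired = false;
  }

  // Call periodically (e.g. from a timer): fires at most once per idle period.
  poll(): void {
    if (!this.fired && this.now() - this.lastKeystroke >= this.idleMs) {
      this.fired = true;
      this.onIdle();
    }
  }
}

// Usage with a fake clock:
let t = 0;
let requests = 0;
const trigger = new IdleTrigger(500, () => requests++, () => t);

trigger.keystroke(); // user types
t += 300;
trigger.poll(); // only 300ms idle: no request yet
trigger.keystroke(); // user types again, timer resets
t += 600;
trigger.poll(); // 600ms idle: one completion request
```

In a real editor integration the change listener would call `keystroke()` and a timer would drive `poll()`; Monacopilot handles this internally.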

@@ -416,17 +413,17 @@ const copilot = new Copilot(process.env.OPENAI_API_KEY, {
});
```

-The default provider is `groq`, and the default model is `llama-3-70b`.
+The default provider is `anthropic`, and the default model is `claude-3-5-haiku`.

-> **Tip:** Even though the default provider and model are `groq` and `llama-3-70b`, it's always recommended to specify a provider and model when using Monacopilot. This ensures your code remains consistent even if the default settings change in future updates.
+> **Tip:** Even though the default provider and model are `anthropic` and `claude-3-5-haiku`, it's always recommended to specify a provider and model when using Monacopilot. This ensures your code remains consistent even if the default settings change in future updates.
There are other providers and models available. Here is a list:

-| Provider | Models |
-| --- | --- |
-| Groq | `llama-3-70b` |
-| OpenAI | `gpt-4o`, `gpt-4o-mini`, `o1-preview`, `o1-mini` |
-| Anthropic | `claude-3-5-sonnet`, `claude-3-opus`, `claude-3-sonnet`, `claude-3-haiku` |
+| Provider | Models |
+| --- | --- |
+| Groq | `llama-3-70b` |
+| OpenAI | `gpt-4o`, `gpt-4o-mini`, `o1-preview`, `o1-mini` |
+| Anthropic | `claude-3-5-sonnet`, `claude-3-haiku`, `claude-3-5-haiku` |
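As an illustration only (the helper name below is an assumption, not a library export), the provider/model table can be expressed as a small lookup for validating a pair before constructing `Copilot` — the real mapping ships in `src/constants/copilot.ts` as `COPILOT_PROVIDER_MODEL_MAP`:

```typescript
// Hypothetical validation helper mirroring the table above.
const PROVIDER_MODELS: Record<string, readonly string[]> = {
  groq: ['llama-3-70b'],
  openai: ['gpt-4o', 'gpt-4o-mini', 'o1-preview', 'o1-mini'],
  anthropic: ['claude-3-5-sonnet', 'claude-3-haiku', 'claude-3-5-haiku'],
};

// Returns true only when the model is offered by the given provider.
function isValidPair(provider: string, model: string): boolean {
  return PROVIDER_MODELS[provider]?.includes(model) ?? false;
}
```

Checking the pair up front gives a clearer error than letting a mismatched provider/model reach the API.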

### Custom Model

2 changes: 1 addition & 1 deletion examples/nextjs/app/.env.example
@@ -1 +1 @@
-GROQ_API_KEY=
+ANTHROPIC_API_KEY=
2 changes: 1 addition & 1 deletion examples/nextjs/app/app/api/complete/route.ts
@@ -2,7 +2,7 @@ import {NextRequest, NextResponse} from 'next/server';

import {Copilot, type CompletionRequestBody} from 'monacopilot';

-const copilot = new Copilot(process.env.GROQ_API_KEY!);
+const copilot = new Copilot(process.env.ANTHROPIC_API_KEY!);

export async function POST(req: NextRequest) {
const body: CompletionRequestBody = await req.json();
2 changes: 1 addition & 1 deletion examples/nextjs/pages/.env.example
@@ -1 +1 @@
-GROQ_API_KEY=
+ANTHROPIC_API_KEY=
2 changes: 1 addition & 1 deletion examples/nextjs/pages/pages/api/complete.ts
@@ -2,7 +2,7 @@ import {NextApiRequest, NextApiResponse} from 'next';

import {Copilot} from 'monacopilot';

-const copilot = new Copilot(process.env.GROQ_API_KEY!);
+const copilot = new Copilot(process.env.ANTHROPIC_API_KEY!);

export default async function handler(
req: NextApiRequest,
2 changes: 1 addition & 1 deletion examples/remix/.env.example
@@ -1 +1 @@
-GROQ_API_KEY=
+ANTHROPIC_API_KEY=
2 changes: 1 addition & 1 deletion examples/remix/app/routes/complete.tsx
@@ -1,7 +1,7 @@
import {json, type ActionFunctionArgs} from '@remix-run/node';
import {Copilot, type CompletionRequestBody} from 'monacopilot';

-const copilot = new Copilot(process.env.GROQ_API_KEY!);
+const copilot = new Copilot(process.env.ANTHROPIC_API_KEY!);

export const action = async ({request}: ActionFunctionArgs) => {
const body: CompletionRequestBody = await request.json();
14 changes: 4 additions & 10 deletions src/constants/copilot.ts
@@ -7,9 +7,8 @@ export const COPILOT_MODEL_IDS: Record<CopilotModel, string> = {
'gpt-4o': 'gpt-4o-2024-08-06',
'gpt-4o-mini': 'gpt-4o-mini',
'claude-3-5-sonnet': 'claude-3-5-sonnet-20241022',
-  'claude-3-opus': 'claude-3-opus-20240229',
-  'claude-3-sonnet': 'claude-3-sonnet-20240229',
'claude-3-haiku': 'claude-3-haiku-20240307',
+  'claude-3-5-haiku': 'claude-3-5-haiku-20241022',
'o1-preview': 'o1-preview',
'o1-mini': 'o1-mini',
} as const;
@@ -20,16 +19,11 @@ export const COPILOT_PROVIDER_MODEL_MAP: Record<
> = {
groq: ['llama-3-70b'],
openai: ['gpt-4o', 'gpt-4o-mini', 'o1-preview', 'o1-mini'],
-  anthropic: [
-    'claude-3-5-sonnet',
-    'claude-3-opus',
-    'claude-3-haiku',
-    'claude-3-sonnet',
-  ],
+  anthropic: ['claude-3-5-sonnet', 'claude-3-haiku', 'claude-3-5-haiku'],
} as const;

-export const DEFAULT_COPILOT_MODEL: CopilotModel = 'llama-3-70b' as const;
-export const DEFAULT_COPILOT_PROVIDER: CopilotProvider = 'groq' as const;
+export const DEFAULT_COPILOT_PROVIDER: CopilotProvider = 'anthropic' as const;
+export const DEFAULT_COPILOT_MODEL: CopilotModel = 'claude-3-5-haiku' as const;

export const COPILOT_PROVIDER_ENDPOINT_MAP: Record<CopilotProvider, string> = {
groq: 'https://api.groq.com/openai/v1/chat/completions',
3 changes: 1 addition & 2 deletions src/constants/provider/anthropic.ts
@@ -2,7 +2,6 @@ import {AnthropicModel} from '../../types';

export const MAX_TOKENS_BY_ANTHROPIC_MODEL: Record<AnthropicModel, number> = {
'claude-3-5-sonnet': 8192,
-  'claude-3-opus': 4096,
+  'claude-3-5-haiku': 8192,
'claude-3-haiku': 4096,
-  'claude-3-sonnet': 4096,
} as const;
16 changes: 14 additions & 2 deletions src/helpers/provider.ts
@@ -83,10 +83,22 @@ const anthropicHandler: ProviderHandler<'anthropic'> = {
}),

parseCompletion: completion => {
-    if (!completion.content || typeof completion.content !== 'string') {
+    if (
+      !completion.content ||
+      !Array.isArray(completion.content) ||
+      !completion.content.length
+    ) {
return null;
}
-    return completion.content;
+
+    const content = completion.content[0];
+    if (!content || typeof content !== 'object') {
+      return null;
+    }
+
+    return 'text' in content && typeof content.text === 'string'
+      ? content.text
+      : null;
},
};
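For context: the Anthropic Messages API returns `content` as an array of typed blocks (e.g. `{type: 'text', text: '…'}`) rather than a plain string, which is what the reworked guards in `parseCompletion` account for. A self-contained sketch of the same parsing logic (illustrative re-implementation; the shipped version is the handler above):

```typescript
// Pulls the text out of an Anthropic-style content-block array,
// returning null for any shape that isn't a non-empty array whose
// first element is a text block.
interface ContentBlock {
  type: string;
  text?: unknown;
}

function parseAnthropicContent(content: unknown): string | null {
  if (!content || !Array.isArray(content) || !content.length) {
    return null;
  }
  const first = content[0] as ContentBlock;
  if (!first || typeof first !== 'object') {
    return null;
  }
  return 'text' in first && typeof first.text === 'string' ? first.text : null;
}
```

Returning `null` instead of throwing lets callers fall through to their existing "no completion" path when the response shape is unexpected.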

7 changes: 3 additions & 4 deletions src/types/copilot.ts
@@ -15,9 +15,8 @@ export type OpenAIModel = 'gpt-4o' | 'gpt-4o-mini' | 'o1-preview' | 'o1-mini';
export type GroqModel = 'llama-3-70b';
export type AnthropicModel =
| 'claude-3-5-sonnet'
-  | 'claude-3-opus'
-  | 'claude-3-haiku'
-  | 'claude-3-sonnet';
+  | 'claude-3-5-haiku'
+  | 'claude-3-haiku';

export type CopilotModel = OpenAIModel | GroqModel | AnthropicModel;

@@ -86,7 +85,7 @@ export interface CopilotOptions {
/**
* The model to use for copilot LLM requests.
* This can be either:
-   * 1. A predefined model name (e.g. 'claude-3-opus'): Use this option if you want to use a model that is built into Monacopilot.
+   * 1. A predefined model name (e.g. 'claude-3-5-haiku'): Use this option if you want to use a model that is built into Monacopilot.
* If you choose this option, also set the `provider` property to the corresponding provider of the model.
* 2. A custom model configuration object: Use this option if you want to use a LLM from a third-party service or your own custom model.
*
21 changes: 7 additions & 14 deletions tests/completion.test.ts
@@ -48,9 +48,9 @@ describe('Completion', () => {
expect.objectContaining({
model: COPILOT_MODEL_IDS[DEFAULT_COPILOT_MODEL],
messages: expect.arrayContaining([
-            {role: 'system', content: expect.any(String)},
{role: 'user', content: expect.any(String)},
]),
+          system: expect.any(String),
temperature: DEFAULT_COPILOT_TEMPERATURE,
}),
expect.objectContaining({
@@ -154,12 +154,12 @@ describe('Completion', () => {
expect.any(String),
expect.objectContaining({
messages: [
-            {role: 'system', content: 'Custom system prompt'},
{
role: 'user',
content: expect.stringContaining('Custom user prompt'),
},
],
+          system: 'Custom system prompt',
}),
expect.any(Object),
);
@@ -177,10 +177,8 @@
expect(HTTP.POST).toHaveBeenCalledWith(
expect.any(String),
expect.objectContaining({
-        messages: [
-          {role: 'system', content: expect.any(String)},
-          {role: 'user', content: expect.any(String)},
-        ],
+        messages: [{role: 'user', content: expect.any(String)}],
+        system: expect.any(String),
}),
expect.any(Object),
);
@@ -204,14 +202,9 @@
expect(HTTP.POST).toHaveBeenCalledWith(
expect.any(String),
expect.objectContaining({
-        messages: [
-          {
-            role: 'system',
-            content:
-              'You are an AI assistant specialized in writing React components.',
-          },
-          {role: 'user', content: expect.any(String)},
-        ],
+        messages: [{role: 'user', content: expect.any(String)}],
+        system:
+          'You are an AI assistant specialized in writing React components.',
}),
expect.any(Object),
);
8 changes: 5 additions & 3 deletions tests/copilot-custom-model.test.ts
@@ -94,7 +94,7 @@ describe('Copilot with model', () => {
});

expect(HTTP.POST).toHaveBeenCalledWith(
-      expect.stringContaining('api.groq.com'),
+      expect.stringContaining('api.anthropic.com'),
expect.any(Object),
expect.any(Object),
);
@@ -122,12 +122,14 @@ describe('Copilot with model', () => {
});

expect(HTTP.POST).toHaveBeenCalledWith(
-      expect.stringContaining('api.groq.com'),
+      expect.stringContaining('api.anthropic.com'),
expect.any(Object),
expect.objectContaining({
headers: expect.objectContaining({
'X-Custom-Header': 'custom-value',
-          Authorization: expect.stringContaining('Bearer'),
+          'Content-Type': 'application/json',
+          'x-api-key': expect.any(String),
+          'anthropic-version': '2023-06-01',
}),
}),
);
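For reference, Anthropic authenticates with an `x-api-key` header plus a pinned `anthropic-version`, rather than the `Authorization: Bearer` scheme used by Groq and OpenAI, which is exactly what the updated header expectations reflect. A hedged sketch of building such headers (the helper name and shape below are illustrative, not the library's internals):

```typescript
// Builds Anthropic-style request headers; '2023-06-01' matches the
// version string pinned in this commit. Caller-supplied headers are
// merged in last so custom values are preserved.
function anthropicHeaders(
  apiKey: string,
  extra: Record<string, string> = {},
): Record<string, string> {
  return {
    'Content-Type': 'application/json',
    'x-api-key': apiKey,
    'anthropic-version': '2023-06-01',
    ...extra,
  };
}
```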
4 changes: 2 additions & 2 deletions tests/mock.ts
@@ -24,10 +24,10 @@ export const mockCompletionMetadata: CompletionMetadata = {
},
};

-export const MOCK_COMPLETION_CONTENT = 'Test completion';
+export const MOCK_COMPLETION_CONTENT = "Hi, I'm Claude.";

export const mockCompletion = {
-  choices: [{message: {content: MOCK_COMPLETION_CONTENT}}],
+  content: [{type: 'text', text: MOCK_COMPLETION_CONTENT}],
};

export const mockError = new Error('API Error');
10 changes: 6 additions & 4 deletions tests/provider-handlers.test.ts
@@ -26,12 +26,14 @@ describe('Provider Handler Functions', () => {
});

it('should call the correct handler for Anthropic', () => {
-    const mockCompletion = {content: 'Anthropic response'};
+    const mockCompletion = {
+      content: [{type: 'text', text: "Hi, I'm Claude."}],
+    };
const result = parseProviderChatCompletion(
mockCompletion as unknown as ChatCompletion,
'anthropic',
);
-    expect(result).toEqual('Anthropic response');
+    expect(result).toEqual("Hi, I'm Claude.");
});
});

@@ -86,10 +88,10 @@ describe('Provider Handler Functions', () => {
describe('parseAnthropicCompletion', () => {
it('should parse a valid Anthropic completion', () => {
const mockCompletion = {
-      content: 'Anthropic response',
+      content: [{type: 'text', text: "Hi, I'm Claude."}],
} as unknown as ChatCompletion;
const result = parseProviderChatCompletion(mockCompletion, 'anthropic');
-    expect(result).toEqual('Anthropic response');
+    expect(result).toEqual("Hi, I'm Claude.");
});

it('should handle missing content', () => {
2 changes: 1 addition & 1 deletion tests/ui/.env.example
@@ -1 +1 @@
-GROQ_API_KEY=your_api_key
+OPENAI_API_KEY=your_api_key
