A minimal and opinionated Anthropic API client built on top of ai-fetch.
anthropic-fetch provides a streamlined interface for interacting with Anthropic's AI models, leveraging the consistent and minimal approach of ai-fetch.
- Fast and small client that doesn't patch fetch
- Supports all environments with native fetch: Node 18+, browsers, Deno, Cloudflare Workers, etc.
- Consistent interface aligned with other ai-fetch-derived clients
- Focused on chat completions and embeddings for Anthropic models
```bash
npm install anthropic-fetch
```

This package requires Node >= 18 or an environment with native fetch support.
This package exports ESM. If your project uses CommonJS, consider switching to ESM or use the dynamic import() function.
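For example, from a CommonJS module the client can be loaded lazily via dynamic `import()` (a minimal sketch; the async wrapper and explicit `apiKey` pass-through are illustrative):

```ts
// CommonJS projects can load the ESM-only package with dynamic import().
async function getClient() {
  const { AnthropicClient } = await import('anthropic-fetch');
  return new AnthropicClient({ apiKey: process.env.ANTHROPIC_API_KEY });
}
```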
```ts
import { AnthropicClient } from 'anthropic-fetch';

const client = new AnthropicClient({
  apiKey: 'your-api-key-here',
});

// Generate a chat completion
const response = await client.createChatCompletion({
  model: 'claude-3-opus-20240229',
  messages: [{ role: 'user', content: 'Hello, Claude!' }],
});

console.log(response.choices[0].message.content);
```

The `apiKey` is optional and will be read from `process.env.ANTHROPIC_API_KEY` if present.
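If the environment variable is set, the options can be omitted entirely (a minimal sketch, assuming the constructor accepts no arguments when the key comes from the environment):

```ts
// Assumes process.env.ANTHROPIC_API_KEY is set in the environment.
const client = new AnthropicClient();
```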
anthropic-fetch implements the following ai-fetch interfaces:
```ts
// Generate a single chat completion
client.createChatCompletion(params: ChatParams): Promise<ChatResponse>;

// Stream a single completion via a ReadableStream
client.streamChatCompletion(params: ChatStreamParams): Promise<ChatStreamResponse>;
```

The type definitions are available through TSServer and can be found in the source code.
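For instance, a streamed completion could be consumed roughly like this (a hedged sketch: the exact chunk shape comes from the ai-fetch type definitions, and the `choices[0].delta.content` access below is an assumption, not confirmed by this README):

```ts
// Sketch: stream a completion and print text as it arrives.
// Assumes the returned ReadableStream yields chunk objects whose text
// lives at chunk.choices[0].delta.content.
const stream = await client.streamChatCompletion({
  model: 'claude-3-opus-20240229',
  messages: [{ role: 'user', content: 'Tell me a short story.' }],
});

const reader = stream.getReader();
while (true) {
  const { done, value: chunk } = await reader.read();
  if (done) break;
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
```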
anthropic-fetch is built on top of ai-fetch
MIT © Dexa