Add an optional AI SDK adapter so users can plug any of AI SDK's 60+ providers into OMA agents.
## Design

- New `AISdkAdapter` class implementing `LLMAdapter`
- New optional `adapter` field on `AgentConfig` -- when set, skip the `createAdapter()` factory
- `ai` as optional peer dependency (same pattern as `@google/genai`)
- Zero breaking changes; existing `provider: 'anthropic'` / `provider: 'openai'` unaffected
## User-facing API

```ts
import { openai } from '@ai-sdk/openai'
import { AISdkAdapter, OpenMultiAgent } from 'open-multi-agent'

const oma = new OpenMultiAgent()

await oma.runAgent(
  {
    name: 'researcher',
    model: 'gpt-4o',
    adapter: new AISdkAdapter(openai('gpt-4o')),
    systemPrompt: 'You are a researcher.',
  },
  'What are the latest AI trends?'
)
```
Mixed teams work naturally -- some agents use AI SDK providers, others use native adapters.
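For example, a mixed team could combine both paths in one run (a sketch against the proposed API; the Anthropic model string is illustrative):

```typescript
import { openai } from '@ai-sdk/openai'
import { AISdkAdapter, OpenMultiAgent } from 'open-multi-agent'

const oma = new OpenMultiAgent()

// Agent 1: routed through the new AI SDK adapter.
await oma.runAgent(
  {
    name: 'researcher',
    model: 'gpt-4o',
    adapter: new AISdkAdapter(openai('gpt-4o')),
    systemPrompt: 'You are a researcher.',
  },
  'Summarize the latest AI trends.'
)

// Agent 2: unchanged native path -- no adapter field,
// so the provider is resolved by the existing createAdapter() factory.
await oma.runAgent(
  {
    name: 'writer',
    model: 'claude-sonnet', // illustrative model name
    provider: 'anthropic',
    systemPrompt: 'You are a writer.',
  },
  'Draft a short report from the research.'
)
```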
## Scope

- `src/llm/ai-sdk.ts` -- `AISdkAdapter` (chat + stream)
- `adapter?` field on `AgentConfig`/`CoordinatorConfig`

## Context

Raised in #25 by @chocofoxy. Instead of manually verifying OpenAI-compatible providers one by one, this adapter gives instant access to all AI SDK providers (OpenAI, Anthropic, Google, DeepSeek, Mistral, Groq, Qwen, Moonshot, and 50+ more).