Clarify Agents SDK is LLM-agnostic #28687
Conversation
Rewrites the 'Using AI Models' reference page to lead with provider agnosticism and show three clear approaches (AI SDK, Workers AI binding, BYO client library) with concrete examples for OpenAI, Anthropic, Google Gemini, and more. Updates the concepts page and landing page to reinforce the same message throughout.
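The provider-agnostic idea the rewrite leads with can be sketched in plain TypeScript. The names below (`TextModel`, `echoModel`, `runAgent`) are illustrative only and are not the Agents SDK's actual API:

```typescript
// Illustrative sketch only -- these names are NOT the Agents SDK API.
// The point: an agent that depends on a narrow `TextModel` interface can
// accept any provider (OpenAI, Anthropic, Gemini, Workers AI, ...) whose
// client is wrapped to match that interface.
type TextModel = {
  generate(prompt: string): Promise<string>;
};

// A stub provider standing in for a real client library.
const echoModel: TextModel = {
  async generate(prompt) {
    return `echo: ${prompt}`;
  },
};

// The "agent" never names a concrete provider.
async function runAgent(model: TextModel, prompt: string): Promise<string> {
  return model.generate(prompt);
}
```

Swapping providers then means swapping the object passed in, not rewriting the agent.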
This pull request requires reviews from CODEOWNERS as it changes files that match the following patterns:
/bonk thoroughly review this:
Fix any issues from the review and then self-review against these criteria again.
… types
- Update Anthropic model from `claude-sonnet-4-20250514` to `claude-sonnet-4-5`
- Add `Connection`/`WSMessage` type imports to streaming examples
- Add HTTP SSE streaming example with `toTextStreamResponse()`
- Add `Env` interface and generic type param to all `AIChatAgent` examples
- Expand streaming section with prominent callout and sub-sections
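As a rough sketch of what the new HTTP streaming example returns: the AI SDK's `toTextStreamResponse()` wraps the model's tokens in a streaming `Response`. The shape can be approximated with only web-standard APIs; the `textStreamResponse` helper below is hypothetical, not SDK code:

```typescript
// Hypothetical helper approximating what the AI SDK's
// result.toTextStreamResponse() produces: a Response whose body is a
// ReadableStream of encoded text chunks. Uses only web-standard APIs
// (available globally in Cloudflare Workers and Node 18+).
function textStreamResponse(chunks: string[]): Response {
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of chunks) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
  return new Response(body, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

The real SDK method streams tokens as they arrive rather than enqueueing a fixed array, but the returned object is the same kind of streaming `Response`.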
All changes committed. Here is a summary of the review and fixes applied:

Self-Review Summary

Criterion 1: Model Recency
- Issue found: All Anthropic model references used the outdated `claude-sonnet-4-20250514` id
- Fix applied: Updated all 4 instances

Criterion 2: Agent Class Definitions in Code Examples
- Issues found:
- Fixes applied:

Criterion 3: Streaming Best Practices
- Issues found:
- Fixes applied:

Validation
```diff
  const result = streamText({
-   model: workersai("@cf/zai-org/glm-4.7-flash"),
+   model: openai("gpt-4.1"),
```
/bonk this is not current at all. gpt-5.3 is.
look up OpenAI's model listings first and then review.
Same for Gemini.
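One way to keep examples from going stale (a suggestion, not something this PR implements): resolve the model id from configuration with a documented fallback, so bumping a model is a one-line env change rather than an edit in every snippet. The `modelFromEnv` helper and `OPENAI_MODEL` variable below are hypothetical:

```typescript
// Hypothetical helper: resolve the model id from an env-style record
// instead of hard-coding it in every example. `OPENAI_MODEL` and the
// fallback id are illustrative, not part of any SDK.
function modelFromEnv(
  env: Record<string, string | undefined>,
  fallback: string,
): string {
  return env.OPENAI_MODEL ?? fallback;
}
```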
…-sonnet-4-6, gemini-3-flash)
All model references have been updated to current versions. Here's a summary of what was verified and changed:

Research findings
Looked up the current model listings for each provider:
Changes applied (14 replacements across 2 files)
Validation
```ts
export class MyAgent extends Agent<Env> {
  async onMessage(connection: Connection, message: WSMessage) {
    const { prompt } = JSON.parse(message as string);
    const workersai = createWorkersAI({ binding: this.env.AI });
```
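A side note on the snippet: `JSON.parse(message as string)` will throw on malformed input from the socket. A defensive variant can be sketched in plain TypeScript; the `parsePrompt` helper below is hypothetical, not SDK code:

```typescript
// Hypothetical helper: safely extract `prompt` from a raw WebSocket
// message. Returns null instead of throwing on malformed JSON or on a
// missing/non-string prompt field.
function parsePrompt(raw: string): string | null {
  try {
    const data = JSON.parse(raw) as { prompt?: unknown };
    return typeof data.prompt === "string" ? data.prompt : null;
  } catch {
    return null;
  }
}
```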
/bonk this example is now worse. Revert this: still use Workers AI for core examples like this.
Committed. The "Surviving disconnections" example at line 63 now uses Workers AI.
All changes are committed. Here is a summary of what was done:
Summary of changes
Problem
The Agents SDK documentation did not make it clear that it is AI provider and model agnostic. Users were confused into thinking the SDK was tied to Cloudflare's own AI services.
Changes across 3 files
- `src/content/docs/agents/api-reference/using-ai-models.mdx` (major rewrite): leads with provider agnosticism and adds concrete examples for OpenAI (`gpt-4.1`), Anthropic (`claude-sonnet-4-20250514`), Google Gemini (`gemini-2.5-flash`), plus a list of 8+ other supported providers (Mistral, Bedrock, Azure, Groq, xAI, Cohere, Fireworks, DeepSeek); `.dev.vars`
- `src/content/docs/agents/concepts/calling-llms.mdx` (targeted updates)
- `src/content/docs/agents/index.mdx` (landing page)

Validation
- `npm run check`: 0 errors
- `npm run lint`: clean
- `npm run format:core:check`: clean

Closes #28686