Conversation

Collaborator

@dsfaccini dsfaccini commented Jan 7, 2026

result.mp4

@dsfaccini dsfaccini added the docs (Improvements or additions to documentation) label Jan 7, 2026
@github-actions

github-actions bot commented Jan 7, 2026

Docs Preview

commit: 10bb20f
Preview URL: https://11a328f1-pydantic-ai-previews.pydantic.workers.dev

Collaborator

@DouweM DouweM left a comment

I think this PR is worth splitting up into 2 PRs because it looks like the page reorg is closer to ready than the popular model catalog

@@ -0,0 +1,55 @@
# Alibaba Cloud Model Studio (DashScope)

Alibaba Cloud Model Studio (DashScope) provides access to Qwen models via an OpenAI-compatible API.
Collaborator

If we're going to have this initial paragraph (note that OpenAI, Anthropic, etc do not), we should at least move the link from the configuration section here.

Collaborator

And let's link "OpenAI-compatible API" to the section that explains those on the openai docs page.

Collaborator Author

TODO: review all newly created model pages. Remove initial paragraph (we didn't have it before)
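For reference, this is roughly what "OpenAI-compatible API" means in practice for DashScope; a minimal sketch using pydantic-ai's generic OpenAI-compatible classes, where the endpoint URL, model name, and class names are assumptions for illustration, not taken from this PR:

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

# Point the generic OpenAI-compatible provider at DashScope's
# compatible-mode endpoint (assumed URL; check the DashScope docs).
model = OpenAIChatModel(
    'qwen-plus',  # illustrative Qwen model name
    provider=OpenAIProvider(
        base_url='https://dashscope-intl.aliyuncs.com/compatible-mode/v1',
        api_key='your-dashscope-api-key',
    ),
)
agent = Agent(model)
result = agent.run_sync('Hello!')
print(result.output)
```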


## Install

To use Azure AI Foundry, you need to either install `pydantic-ai`, or install `pydantic-ai-slim` with the `openai` optional group:
Collaborator

The "openai" will be confusing/ look likea typo here unless we mention teh "via OpenAI compat API" thing

@@ -0,0 +1,59 @@
# DeepSeek

DeepSeek provides high-performance AI models with an OpenAI-compatible API.
Collaborator

Drop the "high-performance" (or the entire paragraph) please, that's too marketing-y for a docs page, and having it here but not elsewhere implies that the other models are not high performance.

@@ -0,0 +1,39 @@
# LiteLLM

LiteLLM provides a unified interface to call 100+ LLM APIs using the OpenAI format.
Collaborator

Consistent wording please, we don't say "OpenAI format" on the other pages
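A minimal sketch of the proxy-server use case that comes up later in the thread, assuming a LiteLLM proxy running locally on its default port (the address, key, and model name are assumptions):

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

# Talk to a locally running LiteLLM proxy, which re-exposes many
# providers behind a single OpenAI-compatible endpoint.
model = OpenAIChatModel(
    'gpt-4o',  # whichever model the proxy is configured to route to
    provider=OpenAIProvider(
        base_url='http://localhost:4000',  # assumed default LiteLLM proxy address
        api_key='your-litellm-proxy-key',
    ),
)
agent = Agent(model)
```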


<div class="model-card" markdown>

### Grok 4
Collaborator

We call this Popular Models and honestly Grok rarely comes up on Slack, so I'd rather stick to the big 3 providers. If we look at usage data, I wouldn't be surprised if Qwen is higher than Grok for example. Maybe we should include Qwen? Then the primary ID can be Alibaba, and we can point out it can be run with Ollama etc

Collaborator Author

I mean, I know xAI gives Grok Code Fast away for free, but yeah, people use it a lot. Funnily enough, Qwen isn't currently in the ranking. But MiMo (which I'd never heard of) is.
[screenshot of the model usage ranking]


@DouweM DouweM mentioned this pull request Jan 10, 2026
dsfaccini and others added 2 commits January 13, 2026 12:07
Resolved conflict in docs/models/overview.md:
- Kept frontmatter from main (SEO metadata)
- Kept our title "Models and Providers"

Co-Authored-By: Claude Opus 4.5 <[email protected]>
),
)
agent = Agent(model)
...
Collaborator Author

Should we link here to other models that can also be run on Azure, so we don't seem OpenAI-fixated?
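For context, the truncated snippet above is the tail of an Azure configuration block; a minimal sketch of its full shape, assuming pydantic-ai's AzureProvider (the endpoint, API version, and key are placeholders, and a non-OpenAI model deployed on Azure AI Foundry could be named instead of gpt-4o):

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.azure import AzureProvider

# Azure AI Foundry exposes deployments through an OpenAI-compatible
# surface, hence the `openai` optional dependency group.
model = OpenAIChatModel(
    'gpt-4o',  # could equally be another model deployed on Azure AI Foundry
    provider=AzureProvider(
        azure_endpoint='https://your-resource.openai.azure.com',
        api_version='2024-06-01',
        api_key='your-azure-api-key',
    ),
)
agent = Agent(model)
```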

dsfaccini and others added 2 commits January 14, 2026 16:59
Addresses review comment - LiteLLM is a proxy server, so pointing
directly to OpenAI's API defeats the purpose. Simplified to show
the proxy server use case.

Co-Authored-By: Claude Opus 4.5 <[email protected]>
- xai.md: grok-2-1212 -> grok-4-fast
- vercel.md: claude-4-sonnet -> claude-sonnet-4-5

Co-Authored-By: Claude Opus 4.5 <[email protected]>

Labels

awaiting author revision, docs (Improvements or additions to documentation)
