Add model catalog #3941
base: main
Conversation
Docs Preview
DouweM left a comment
I think this PR is worth splitting up into 2 PRs because it looks like the page reorg is closer to ready than the popular model catalog
docs/models/alibaba.md (Outdated)

@@ -0,0 +1,55 @@
# Alibaba Cloud Model Studio (DashScope)

Alibaba Cloud Model Studio (DashScope) provides access to Qwen models via an OpenAI-compatible API.
If we're going to have this initial paragraph (note that OpenAI, Anthropic, etc. do not), we should at least move the link from the configuration section here.
And let's link "OpenAI-compatible API" to the section that explains those on the openai docs page.
TODO: review all newly created model pages. Remove initial paragraph (we didn't have it before)
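For context, a minimal sketch of the configuration being referenced: pointing pydantic-ai at DashScope's OpenAI-compatible endpoint. This assumes the current `OpenAIChatModel`/`OpenAIProvider` API; the base URL, model name, and key below are illustrative placeholders rather than what this PR's page necessarily uses.

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

# Assumed values: check Alibaba Cloud Model Studio for the current
# OpenAI-compatible base URL and available Qwen model names.
model = OpenAIChatModel(
    'qwen-plus',
    provider=OpenAIProvider(
        base_url='https://dashscope-intl.aliyuncs.com/compatible-mode/v1',
        api_key='your-dashscope-api-key',
    ),
)
agent = Agent(model)
result = agent.run_sync('What is the capital of France?')
print(result.output)
```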
docs/models/azure.md (Outdated)

## Install

To use Azure AI Foundry, you need to either install `pydantic-ai`, or install `pydantic-ai-slim` with the `openai` optional group:
The "openai" will be confusing/ look likea typo here unless we mention teh "via OpenAI compat API" thing
docs/models/deepseek.md (Outdated)

@@ -0,0 +1,59 @@
# DeepSeek

DeepSeek provides high-performance AI models with an OpenAI-compatible API.
Drop the "high-performance" (or the entire paragraph) please, that's too marketing-y for a docs page, and having it here but not elsewhere implies that the other models are not high performance.
docs/models/litellm.md (Outdated)

@@ -0,0 +1,39 @@
# LiteLLM

LiteLLM provides a unified interface to call 100+ LLM APIs using the OpenAI format.
Consistent wording please, we don't say "OpenAI format" on the other pages.
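Related to the later commit that reframes this page around the proxy server: a sketch of talking to a locally running LiteLLM proxy through its OpenAI-compatible endpoint. The port and model alias are assumptions (LiteLLM's proxy defaults to port 4000), and the proxy setup itself is out of scope here.

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

# Assumes a LiteLLM proxy running locally (e.g. `litellm --config config.yaml`);
# 'gpt-4o' is whatever alias the proxy config maps to an upstream model.
model = OpenAIChatModel(
    'gpt-4o',
    provider=OpenAIProvider(
        base_url='http://localhost:4000',
        api_key='your-litellm-proxy-key',
    ),
)
agent = Agent(model)
```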
<div class="model-card" markdown>

### Grok 4
We call this Popular Models and honestly Grok rarely comes up on Slack, so I'd rather stick to the big 3 providers. If we look at usage data, I wouldn't be surprised if Qwen is higher than Grok for example. Maybe we should include Qwen? Then the primary ID can be Alibaba, and we can point out it can be run with Ollama etc
Resolved conflict in docs/models/overview.md:
- Kept frontmatter from main (SEO metadata)
- Kept our title "Models and Providers"

Co-Authored-By: Claude Opus 4.5 <[email protected]>
    ),
)
agent = Agent(model)
...
Should we link here to other models that can also be run in Azure, to not seem OpenAI-fixated?
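As one way to address this, the same snippet could swap in a non-OpenAI deployment. The deployment name below is a hypothetical example of a Llama model hosted on Azure AI Foundry; whether a given deployment is reachable this way depends on it exposing Azure's OpenAI-compatible surface.

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.azure import AzureProvider

# Hypothetical non-OpenAI deployment on the same Azure AI Foundry resource;
# the model name must match whatever deployment exists in your project.
model = OpenAIChatModel(
    'Llama-3.3-70B-Instruct',
    provider=AzureProvider(
        azure_endpoint='https://your-resource.services.ai.azure.com',
        api_version='2024-10-21',
        api_key='your-azure-api-key',
    ),
)
agent = Agent(model)
```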
Addresses review comment: LiteLLM is a proxy server, so pointing directly to OpenAI's API defeats the purpose. Simplified to show the proxy server use case.

Co-Authored-By: Claude Opus 4.5 <[email protected]>
- xai.md: grok-2-1212 -> grok-4-fast
- vercel.md: claude-4-sonnet -> claude-sonnet-4-5

Co-Authored-By: Claude Opus 4.5 <[email protected]>

result.mp4