feat(providers): add first-class Qianfan provider support#922

Open
jimmyzhuu wants to merge 1 commit into katanemo:main from jimmyzhuu:codex/add-qianfan-provider

Conversation

@jimmyzhuu

What changed

  • Adds qianfan as a first-class OpenAI-compatible provider.
  • Registers qianfan/* in the CLI config generator and JSON schema.
  • Adds zero-config defaults using QIANFAN_API_KEY and https://qianfan.baidubce.com/v2.
  • Adds ProviderId::Qianfan / LlmProviderType::Qianfan with parse, display, and round-trip coverage.
  • Routes OpenAI-compatible client requests from /v1/chat/completions to Qianfan's /v2/chat/completions upstream path.
  • Documents Baidu Qianfan configuration examples.
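The zero-config defaults above (env-var key, fixed upstream base URL) can be sketched roughly as follows. This is an illustrative sketch, not Plano's actual API: the function names are hypothetical, and only the env-var name `QIANFAN_API_KEY` and the default URL `https://qianfan.baidubce.com/v2` come from this PR.

```rust
use std::env;

// Hypothetical sketch of zero-config resolution for a Qianfan provider entry.
// Function names are illustrative; defaults match the PR description.
fn qianfan_base_url(configured: Option<&str>) -> String {
    configured
        .map(str::to_string)
        // No explicit base_url configured: fall back to the documented default.
        .unwrap_or_else(|| "https://qianfan.baidubce.com/v2".to_string())
}

fn qianfan_api_key() -> Option<String> {
    // Zero-config setups read the key from the environment.
    env::var("QIANFAN_API_KEY").ok()
}

fn main() {
    assert_eq!(qianfan_base_url(None), "https://qianfan.baidubce.com/v2");
    // An explicit override wins over the default.
    assert_eq!(
        qianfan_base_url(Some("https://example.invalid/v2")),
        "https://example.invalid/v2"
    );
    println!("QIANFAN_API_KEY set: {}", qianfan_api_key().is_some());
}
```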

Why this change is needed

Qianfan supports OpenAI-compatible chat completions, but using it today requires users to configure a custom provider with both base_url and provider_interface: openai.
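For context, the pre-existing workaround looks roughly like this (a sketch based on the fields named above; the exact shape of a custom-provider entry may differ):

```yaml
model_providers:
  - model: qianfan/ernie-4.0-turbo-8k
    access_key: $QIANFAN_API_KEY
    base_url: https://qianfan.baidubce.com/v2
    provider_interface: openai
```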

With this change, Qianfan follows the same first-class provider flow as other OpenAI-compatible providers: users can configure qianfan/* directly, use QIANFAN_API_KEY in zero-config setups, and rely on Plano's validation and routing behavior.

User impact

Users can configure Qianfan with:

model_providers:
  - model: qianfan/*
    access_key: $QIANFAN_API_KEY

or a specific model:

model_providers:
  - model: qianfan/ernie-4.0-turbo-8k
    access_key: $QIANFAN_API_KEY

Plano routes OpenAI-compatible chat completion requests to https://qianfan.baidubce.com/v2/chat/completions.
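The upstream path mapping can be sketched as below. This is an illustrative function, not the actual routing code; only the `/v1/chat/completions` to `/v2/chat/completions` mapping comes from this PR.

```rust
// Hypothetical sketch of the upstream path rewrite described above:
// OpenAI-compatible client requests to /v1/chat/completions are forwarded
// to Qianfan's /v2/chat/completions. All other paths pass through unchanged.
fn qianfan_upstream_path(client_path: &str) -> String {
    match client_path {
        "/v1/chat/completions" => "/v2/chat/completions".to_string(),
        other => other.to_string(),
    }
}

fn main() {
    assert_eq!(
        qianfan_upstream_path("/v1/chat/completions"),
        "/v2/chat/completions"
    );
    assert_eq!(qianfan_upstream_path("/v1/models"), "/v1/models");
}
```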

Tests / validation

  • cd cli && uv run pytest -q
    • 102 passed
  • cd crates && cargo fmt --all -- --check
  • cd crates && cargo test -p hermesllm test_qianfan --lib
    • 2 passed
  • cd crates && cargo test -p common test_llm_provider_type_qianfan_roundtrip --lib
    • 1 passed
  • git diff --check

Notes for reviewers

  • This PR intentionally keeps Qianfan as an OpenAI-compatible provider and does not add native Qianfan-specific request/response transforms.
  • baidu is accepted as a provider alias in ProviderId, but the canonical config prefix is qianfan.
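The alias behavior in the last note can be sketched as follows: both "qianfan" and "baidu" parse to the same variant, while Display always emits the canonical "qianfan". This mirrors, but is not, the actual ProviderId implementation in the common crate.

```rust
use std::fmt;
use std::str::FromStr;

// Minimal sketch of the alias behavior: "baidu" is accepted on parse,
// but the canonical form is always "qianfan".
#[derive(Debug, PartialEq, Eq, Clone, Copy)]
enum ProviderId {
    Qianfan,
}

impl FromStr for ProviderId {
    type Err = String;
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "qianfan" | "baidu" => Ok(ProviderId::Qianfan),
            other => Err(format!("unknown provider: {other}")),
        }
    }
}

impl fmt::Display for ProviderId {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Display the canonical config prefix regardless of the alias parsed.
        f.write_str("qianfan")
    }
}

fn main() {
    assert_eq!("baidu".parse::<ProviderId>(), Ok(ProviderId::Qianfan));
    // Round-trip through the alias still yields the canonical name.
    assert_eq!("baidu".parse::<ProviderId>().unwrap().to_string(), "qianfan");
}
```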

@jimmyzhuu jimmyzhuu marked this pull request as ready for review April 25, 2026 05:59
