This repository was archived by the owner on Apr 12, 2026. It is now read-only.

feat: add Kimi (Moonshot AI) as LLM provider #177

Open

warku123 wants to merge 2 commits into tuannvm:main from warku123:feat/add-model

Conversation


@warku123 warku123 commented Mar 10, 2026

Summary

Add Kimi (Moonshot AI) as a supported LLM provider, enabling users to use Kimi models through the OpenAI-compatible API gateway.

Changes

  • Register Kimi as a new LLM provider type using the OpenAI model factory
  • Add default Kimi configuration (model: kimi-k2.5, base URL: https://api.moonshot.cn/v1, temperature: 1)
  • Support KIMI_API_KEY, KIMI_MODEL, and KIMI_BASE_URL environment variable overrides
  • Add Kimi provider validation in config validation logic
  • Update config schema to include Kimi provider

Testing

  • Tests pass locally
  • New tests added (if applicable)

Related Issues

Summary by CodeRabbit

New Features

  • Added support for Kimi as an LLM provider with configurable settings via environment variables (KIMI_API_KEY, KIMI_MODEL, KIMI_BASE_URL)
  • Kimi provider now includes default configuration with model "kimi-k2.5" and pre-configured API endpoint
  • Kimi is now available as a selectable provider option alongside existing providers

warku123 and others added 2 commits March 10, 2026 18:19
Add Kimi support by reusing the OpenAI-compatible LangChain factory.
Default model is kimi-k2.5 with base URL https://api.moonshot.cn/v1.
Supports KIMI_API_KEY, KIMI_MODEL, KIMI_BASE_URL environment variables.

Co-Authored-By: Claude Opus 4.6 <[email protected]>
Signed-off-by: warku123 <[email protected]>
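Reusing the OpenAI-compatible factory typically means registering the new provider type against the existing factory in a lookup map. The sketch below illustrates that idea with hypothetical names (`modelFactory`, `openAIModelFactory`, `factories`); the real signatures live in internal/llm/langchain.go.

```go
package main

import "fmt"

// modelFactory stands in for the project's LangChain model factory
// signature; the real one lives in internal/llm/langchain.go.
type modelFactory func(model, baseURL string) string

// openAIModelFactory is a placeholder for the shared OpenAI-compatible
// factory that Kimi reuses.
func openAIModelFactory(model, baseURL string) string {
	return fmt.Sprintf("openai-compatible client for %s at %s", model, baseURL)
}

// Provider identifiers; ProviderTypeKimi mirrors the constant this PR adds.
const (
	ProviderTypeOpenAI = "openai"
	ProviderTypeKimi   = "kimi"
)

// factories maps provider types to factories; Kimi points at the same
// OpenAI-compatible factory because Moonshot exposes an OpenAI-style API.
var factories = map[string]modelFactory{
	ProviderTypeOpenAI: openAIModelFactory,
	ProviderTypeKimi:   openAIModelFactory,
}

func main() {
	build := factories[ProviderTypeKimi]
	fmt.Println(build("kimi-k2.5", "https://api.moonshot.cn/v1"))
}
```

Because only the map entry is new, no Kimi-specific client code is needed; everything else rides on the existing OpenAI path.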

coderabbitai Bot commented Mar 10, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 5c6ce8af-702a-422d-928d-7c9165ecabed

📥 Commits

Reviewing files that changed from the base of the PR and between 131233d and 8371c8d.

📒 Files selected for processing (5)
  • internal/config/config.go
  • internal/config/validation.go
  • internal/llm/langchain.go
  • internal/llm/provider.go
  • schema/config-schema.json

Walkthrough

This pull request adds support for a new LLM provider called "Kimi" by introducing a provider identifier, default configuration values, environment variable wiring, validation rules, LangChain factory registration, and JSON schema updates to align with existing provider patterns.

Changes

  • Provider Constants (internal/llm/provider.go, internal/config/config.go) — Added ProviderTypeKimi and ProviderKimi constants to register the new LLM provider identifier.
  • Configuration Defaults & Environment Wiring (internal/config/config.go) — Implemented default Kimi configuration (model: "kimi-k2.5", baseURL: "https://api.moonshot.cn/v1", temperature: 1) in applyLLMDefaults and added runtime environment variable support (KIMI_API_KEY, KIMI_MODEL, KIMI_BASE_URL) in ApplyEnvironmentVariables.
  • Validation (internal/config/validation.go) — Added a validation case for ProviderKimi to ensure APIKey is non-empty and not a placeholder, mirroring existing provider validation patterns.
  • LangChain Integration (internal/llm/langchain.go) — Registered ProviderTypeKimi with OpenAIModelFactory in the LangChain model factory initialization.
  • Schema Definition (schema/config-schema.json) — Extended the llm.provider enum to include "kimi" and added a corresponding kimi entry under llm.providers referencing the standard llm_provider definition.
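The validation case (non-empty API key that is not a placeholder) can be sketched as follows. The function name `validateKimiAPIKey` and the placeholder heuristic are hypothetical, shown only to illustrate the rule; the repository's actual check lives in internal/config/validation.go.

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// validateKimiAPIKey illustrates the rule described above: the key must
// be non-empty and must not be an obvious placeholder. The function name
// and the placeholder check are illustrative, not the repo's exact code.
func validateKimiAPIKey(apiKey string) error {
	if apiKey == "" {
		return errors.New("kimi provider: APIKey must not be empty")
	}
	lower := strings.ToLower(apiKey)
	if strings.Contains(lower, "your") || strings.Contains(lower, "placeholder") {
		return fmt.Errorf("kimi provider: APIKey looks like a placeholder: %q", apiKey)
	}
	return nil
}

func main() {
	for _, key := range []string{"", "YOUR_API_KEY", "sk-abc123"} {
		fmt.Printf("%q -> %v\n", key, validateKimiAPIKey(key))
	}
}
```

Failing fast on placeholder keys at config-load time surfaces misconfiguration before the first API call is made.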

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~12 minutes

Possibly related PRs

  • tuannvm/slack-mcp-client#60: Adds OLLAMA environment variable handling in ApplyEnvironmentVariables using the same pattern now applied to Kimi.
  • tuannvm/slack-mcp-client#48: Adds support for a new LLM provider using an analogous sequence of constant declarations, default configuration, validation, and factory registration.

Poem

🐰 A Kimi hops into our config file,
With defaults set and env vars compiled,
Validation checks keep it pure and true,
LangChain factory wires it through and through,
Another provider joins the cozy crew! 🌙

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check — ✅ Passed: Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check — ✅ Passed: The title "feat: add Kimi (Moonshot AI) as LLM provider" directly and accurately summarizes the main change: adding a new LLM provider (Kimi/Moonshot AI). It is concise, clear, and specific.
  • Docstring Coverage — ✅ Passed: Docstring coverage is 100.00%, which meets the required threshold of 80.00%.

