
feat: add MiniMax as first-class LLM provider with preset system #156

Open
octo-patch wants to merge 1 commit into denizsafak:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add a provider preset system to the LLM settings page — a dropdown lets users pick MiniMax, OpenAI, DeepSeek, or Ollama and auto-fills the endpoint URL, model list, and API-key hint
  • New abogen/llm_providers.py module with LLMProviderPreset dataclass and 4 built-in presets (MiniMax at https://api.minimax.io/v1, OpenAI, DeepSeek, Ollama)
  • Custom endpoints still work via the "Custom endpoint" option — no breaking changes
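The preset module described above could look roughly like the following minimal sketch. The field names, model lists, and API-key hints here are assumptions for illustration (only the MiniMax base URL `https://api.minimax.io/v1` is taken from the PR itself), not the PR's actual code:

```python
from dataclasses import asdict, dataclass


@dataclass(frozen=True)
class LLMProviderPreset:
    """Immutable description of one LLM provider preset (field names assumed)."""
    id: str
    name: str
    base_url: str
    models: tuple = ()
    api_key_hint: str = ""

    def to_dict(self) -> dict:
        # Serializable form for passing presets into the settings template
        return asdict(self)


# Built-in presets; model names and hints below are placeholders, not the PR's values
_PRESETS = (
    LLMProviderPreset("minimax", "MiniMax", "https://api.minimax.io/v1",
                      ("MiniMax-Text-01",), "MiniMax API key"),
    LLMProviderPreset("openai", "OpenAI", "https://api.openai.com/v1",
                      ("gpt-4o-mini",), "OpenAI API key"),
    LLMProviderPreset("deepseek", "DeepSeek", "https://api.deepseek.com/v1",
                      ("deepseek-chat",), "DeepSeek API key"),
    LLMProviderPreset("ollama", "Ollama", "http://localhost:11434/v1",
                      ("llama3",), "not required for local Ollama"),
)


def get_provider_presets() -> list:
    """Return all built-in presets in declaration order."""
    return list(_PRESETS)


def get_provider_by_id(provider_id: str):
    """Return the preset with the given id, or None if unknown."""
    return next((p for p in _PRESETS if p.id == provider_id), None)
```

A frozen dataclass keeps presets immutable, so the UI can only ever diverge from a preset by switching to the "Custom endpoint" option rather than mutating shared state.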

Changes

| File | What changed |
| --- | --- |
| `abogen/llm_providers.py` | New module: `LLMProviderPreset` dataclass, built-in presets, `get_provider_presets()` and `get_provider_by_id()` |
| `abogen/normalization_settings.py` | Added `llm_provider` key to settings defaults |
| `abogen/webui/templates/settings.html` | Provider `<select>` dropdown in LLM settings section |
| `abogen/webui/static/settings.js` | `applyProviderPreset()` auto-fill logic + `initProviderDropdown()` |
| `abogen/webui/routes/settings.py` | Pass `llm_provider_presets` to template |
| `abogen/webui/routes/utils/settings.py` | Thread `llm_provider` through settings pipeline |
| `README.md` | Provider table and MiniMax configuration docs |
| `.env.example` | MiniMax Cloud commented-out example alongside Ollama |
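The `.env.example` addition might look something like the fragment below. The variable names are purely illustrative assumptions, not abogen's actual configuration keys:

```shell
# Hypothetical .env.example fragment — variable names are illustrative only.

# --- Ollama (local) ---
# LLM_BASE_URL=http://localhost:11434/v1
# LLM_API_KEY=

# --- MiniMax Cloud ---
# LLM_BASE_URL=https://api.minimax.io/v1
# LLM_API_KEY=your-minimax-api-key
```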

Test plan

  • 10 unit tests for llm_providers.py (preset lookup, uniqueness, frozen dataclass, to_dict, etc.)
  • 3 integration tests (settings pipeline round-trip, build_llm_configuration with MiniMax preset)
  • All 13 new tests passing
  • Manual: select MiniMax from dropdown -> verify base URL, models, and hint auto-fill
  • Manual: select "Custom endpoint" -> verify fields are editable as before
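The uniqueness and frozen-dataclass checks from the unit-test list could be sketched as below. This is a self-contained illustration with an assumed minimal `LLMProviderPreset` shape, not the PR's actual test code:

```python
from dataclasses import FrozenInstanceError, dataclass


@dataclass(frozen=True)
class LLMProviderPreset:
    id: str
    name: str
    base_url: str


presets = [
    LLMProviderPreset("minimax", "MiniMax", "https://api.minimax.io/v1"),
    LLMProviderPreset("ollama", "Ollama", "http://localhost:11434/v1"),
]

# Uniqueness: no two presets may share an id
ids = [p.id for p in presets]
assert len(ids) == len(set(ids))

# Frozen: mutating a preset must raise, keeping shared presets immutable
try:
    presets[0].name = "changed"
except FrozenInstanceError:
    pass
else:
    raise AssertionError("preset should be immutable")
```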

Add a provider dropdown to the LLM settings page so users can pick
MiniMax, OpenAI, DeepSeek, or Ollama and have the endpoint URL, model
list, and API-key hint auto-filled.  Custom endpoints still work via
the "Custom endpoint" option.

- New `abogen/llm_providers.py` — `LLMProviderPreset` dataclass and
  built-in presets (MiniMax, OpenAI, DeepSeek, Ollama)
- Settings UI: provider `<select>` dropdown with JS auto-fill logic
- Backend: `llm_provider` key threaded through settings pipeline
- README: provider table and MiniMax configuration docs
- `.env.example`: MiniMax Cloud commented-out example
- 10 unit tests + 3 integration tests (all passing)