
feat: add MiniMax as direct LLM provider for podcast generation#4

Open
octo-patch wants to merge 1 commit into panyanyany:master from octo-patch:feature/add-minimax-llm-provider

Conversation

@octo-patch

Summary

Adds MiniMax M2.7 as a first-class LLM provider alongside OpenRouter and xAI. Since Twocast already uses MiniMax for TTS, users can reuse the same MINIMAX_TOKEN for LLM by setting LLM_PROVIDER=minimax.

What changed

  • New src/utils/llm.ts: Provider preset system with auto-detection, temperature clamping, and think-tag stripping for M2.7
  • Updated src/utils/xai.ts: Uses provider config instead of raw env vars; fully backward-compatible
  • Updated .env.example: Documented LLM_PROVIDER and MINIMAX_API_KEY
  • Updated README.md and README.zh-CN.md: Provider comparison table with MiniMax docs
  • 31 unit tests + 3 integration tests (all passing)
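The preset system described above can be sketched roughly as follows. This is an illustrative sketch, not the actual contents of src/utils/llm.ts: the function names and the key-precedence order are assumptions; only LLM_PROVIDER, MINIMAX_API_KEY, and MINIMAX_TOKEN appear in the PR itself.

```typescript
type Provider = "openrouter" | "xai" | "minimax";

// Auto-detection: an explicit LLM_PROVIDER wins; otherwise fall back to
// whichever provider key is present (order of checks is an assumption).
function detectProvider(env: Record<string, string | undefined>): Provider {
  const explicit = env.LLM_PROVIDER as Provider | undefined;
  if (explicit) return explicit;
  if (env.OPENROUTER_API_KEY) return "openrouter";
  if (env.XAI_API_KEY) return "xai";
  if (env.MINIMAX_API_KEY || env.MINIMAX_TOKEN) return "minimax";
  throw new Error("No LLM provider configured");
}

// MiniMax rejects temperature <= 0, so clamp to a small positive floor
// (the exact floor value here is a guess).
function clampTemperature(provider: Provider, temperature: number): number {
  return provider === "minimax" ? Math.max(temperature, 0.01) : temperature;
}

// M2.7 can emit <think>…</think> reasoning blocks; strip them so they
// never leak into the generated podcast script.
function stripThinkTags(text: string): string {
  return text.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
}
```

Keeping detection, clamping, and output cleanup in one module is what lets src/utils/xai.ts consume a resolved provider config instead of reading raw env vars.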

Why MiniMax?

  • MiniMax M2.7 offers a 1M token context window, ideal for long documents and podcast scripts
  • Its OpenAI-compatible API integrates with the existing request pipeline without adapter code
  • Users with MINIMAX_TOKEN for TTS get LLM support with zero additional setup
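For a user who already has TTS working, the zero-extra-setup path looks like this (the precedence of a dedicated key over the shared token is an assumption, not confirmed by the PR):

```shell
# MINIMAX_TOKEN is already set for TTS; LLM support needs only this:
export LLM_PROVIDER=minimax

# Optionally, a dedicated LLM key can be set instead of reusing the token:
# export MINIMAX_API_KEY=...
```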

Test plan

  • 31 unit tests covering provider detection, config resolution, temperature clamping, think-tag stripping
  • 3 integration tests verifying full request pipeline
  • Manual test: set LLM_PROVIDER=minimax and generate a podcast
  • Verify backward compatibility: existing OpenRouter config still works
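The think-tag-stripping cases in the test plan above might look like the following. The helper name is assumed and defined inline so the example is self-contained; the actual tests in the PR are not shown.

```typescript
// Assumed helper, mirroring the behavior described for M2.7 output cleanup.
const stripThinkTags = (text: string): string =>
  text.replace(/<think>[\s\S]*?<\/think>/g, "").trim();

// Multiline reasoning blocks are removed entirely.
console.assert(
  stripThinkTags("<think>step 1\nstep 2</think>Final answer") === "Final answer"
);

// Output without think tags passes through unchanged.
console.assert(stripThinkTags("plain output") === "plain output");
```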

7 files changed, 557 additions(+), 27 deletions(-)

Add multi-provider LLM support with MiniMax M2.7 as a first-class option.
Users who already have MINIMAX_TOKEN for TTS can reuse it for LLM by setting
LLM_PROVIDER=minimax — no extra API key needed.

Changes:
- New src/utils/llm.ts: provider preset system with auto-detection,
  temperature clamping (MiniMax requires >0), and think-tag stripping
- Updated src/utils/xai.ts: uses provider config instead of raw env vars
- Updated .env.example: documented LLM_PROVIDER and MINIMAX_API_KEY
- Updated README.md & README.zh-CN.md: LLM provider table with MiniMax docs
- 31 unit tests + 3 integration tests (all passing)
