
feat: add LiteLLM as AI gateway provider #157

Open

RheagalFire wants to merge 1 commit into hydropix:main from RheagalFire:feat/add-litellm-provider

Conversation

@RheagalFire

Summary

  • Adds LiteLLM as a native provider alongside Ollama, OpenAI, Gemini, OpenRouter, Mistral, DeepSeek, and Poe
  • Enables access to 100+ LLM providers via provider-prefixed model names (e.g. anthropic/claude-sonnet-4-6, bedrock/anthropic.claude-v2)

Motivation

TranslateBooksWithLLMs already supports 7 providers through individual implementations. LiteLLM adds a unified gateway that handles provider-specific API differences, parameter translation, and authentication, letting users reach 100+ providers without writing a separate adapter for each.

Changes

  • src/core/llm/providers/litellm.py — new LiteLLMProvider extending the LLMProvider ABC, with generate(), retry logic with transient-error detection, and context-overflow handling (sketched after this list)
  • src/core/llm/factory.py — added a "litellm" case to the create_llm_provider() factory
  • requirements.txt — added litellm>=1.65,<1.85 as a commented-out optional dependency
  • tests/unit/test_litellm_provider.py — 4 unit tests covering completion dispatch, credential forwarding, system-prompt handling, and factory integration
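
A minimal sketch of what the provider might look like, based on the description above. The LLMProvider and LLMResponse stand-ins, the transient-error markers, and the backoff policy are illustrative assumptions (and context-overflow handling is elided), not the exact code in the diff:

# Sketch only: ABC/response stand-ins, error markers, and backoff are
# assumptions for illustration; context-overflow handling is elided.
import asyncio
from dataclasses import dataclass


@dataclass
class LLMResponse:
    content: str


class LLMProvider:
    """Stand-in for the project's provider ABC."""

    async def generate(self, prompt: str, system_prompt: str | None = None) -> LLMResponse:
        raise NotImplementedError


# Substrings that suggest a retryable failure (assumed heuristic).
TRANSIENT_MARKERS = ("rate limit", "timeout", "overloaded", "connection")


class LiteLLMProvider(LLMProvider):
    def __init__(self, model: str, api_key: str | None = None, max_retries: int = 3):
        self.model = model
        self.api_key = api_key
        self.max_retries = max_retries

    async def generate(self, prompt: str, system_prompt: str | None = None) -> LLMResponse:
        # Lazy import keeps litellm optional: the package only needs to be
        # installed when this provider is actually used.
        import litellm

        messages = []
        if system_prompt:
            messages.append({"role": "system", "content": system_prompt})
        messages.append({"role": "user", "content": prompt})

        # drop_params=True tells LiteLLM to silently drop kwargs a given
        # backend does not support, for cross-provider compatibility.
        kwargs = {"model": self.model, "messages": messages, "drop_params": True}
        if self.api_key:  # blank credentials are omitted, per the unit tests
            kwargs["api_key"] = self.api_key

        for attempt in range(self.max_retries):
            try:
                resp = await litellm.acompletion(**kwargs)
                return LLMResponse(content=resp.choices[0].message.content)
            except Exception as exc:
                retryable = any(m in str(exc).lower() for m in TRANSIENT_MARKERS)
                if not retryable or attempt == self.max_retries - 1:
                    raise
                await asyncio.sleep(2 ** attempt)  # simple exponential backoff
        raise RuntimeError("unreachable")  # satisfies type-checkers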

Tests

1. Unit tests: pytest tests/unit/test_litellm_provider.py -v
tests/unit/test_litellm_provider.py::test_generate_calls_acompletion PASSED
tests/unit/test_litellm_provider.py::test_generate_omits_blank_credentials PASSED
tests/unit/test_litellm_provider.py::test_generate_forwards_system_prompt PASSED
tests/unit/test_litellm_provider.py::test_factory_creates_litellm_provider PASSED
4 passed in 0.12s
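
For illustration, the first of these tests might look roughly like this. The patch target, the response shape, and the use of pytest-asyncio are assumptions; only the test name and the factory call come from the PR:

# Hypothetical shape of test_generate_calls_acompletion; assumes litellm
# is importable in the test environment and pytest-asyncio is installed.
from types import SimpleNamespace
from unittest.mock import AsyncMock, patch

import pytest

from src.core.llm.factory import create_llm_provider


@pytest.mark.asyncio
async def test_generate_calls_acompletion():
    provider = create_llm_provider(
        "litellm", model="anthropic/claude-sonnet-4-6", api_key="sk-test"
    )
    fake = SimpleNamespace(
        choices=[SimpleNamespace(message=SimpleNamespace(content="Bonjour"))]
    )
    with patch("litellm.acompletion", new=AsyncMock(return_value=fake)) as call:
        result = await provider.generate("Hello", system_prompt="Translate to French")
    call.assert_awaited_once()
    assert call.await_args.kwargs["model"] == "anthropic/claude-sonnet-4-6"
    assert result.content == "Bonjour"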

Risk / Compatibility

  • Additive only — existing providers untouched
  • litellm is optional — not added to core requirements
  • LiteLLM lazy-imported inside generate() to avoid import errors when not installed
  • drop_params=True by default for cross-provider kwarg compatibility
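
The optional-dependency story might look like this in create_llm_provider(); the match-based dispatch is an assumption, not the actual diff:

# Sketch of the factory addition. Importing the provider module is safe
# even without litellm installed, because litellm itself is only imported
# inside LiteLLMProvider.generate().
def create_llm_provider(name: str, **kwargs):
    match name:
        case "litellm":
            from src.core.llm.providers.litellm import LiteLLMProvider
            return LiteLLMProvider(**kwargs)
        # ... existing cases for ollama, openai, gemini, etc. ...
        case _:
            raise ValueError(f"Unknown LLM provider: {name}")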

Example usage

import asyncio
from src.core.llm.factory import create_llm_provider

async def main():
    provider = create_llm_provider("litellm", model="anthropic/claude-sonnet-4-6", api_key="sk-...")
    response = await provider.generate("Translate: Hello world", system_prompt="Translate to French")
    print(response.content)

asyncio.run(main())

Or via CLI: set LLM_PROVIDER=litellm and MODEL=anthropic/claude-sonnet-4-6 in your .env.
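
For example (only these two variable names appear in this PR; any API-key variable is provider-specific and not shown):

# .env
LLM_PROVIDER=litellm
MODEL=anthropic/claude-sonnet-4-6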

@RheagalFire (Author)

cc @hydropix

