
feat: add remote embedding providers (Voyage AI, OpenAI-compatible)#65

Open
adam91holt wants to merge 1 commit into tobi:main from adam91holt:feat/remote-providers

Conversation

@adam91holt

Summary

Add support for remote embedding APIs as an alternative to local models. This enables:

  • Faster embeddings without requiring a GPU
  • Higher quality embeddings via Voyage AI
  • Flexibility to use any OpenAI-compatible API

Changes

New Features

  • QMD_PROVIDER env var to select provider: voyage, openai, or local (default)
  • Voyage AI support: voyage-4-lite embeddings + rerank-2 reranking
  • OpenAI-compatible support: works with OpenAI, Ollama, vLLM, LM Studio, etc.
  • Provider info shown in qmd status output

Technical Details

  • New RemoteLLM class implementing the LLM interface
  • getDefaultLLM() factory function for provider selection
  • Query expansion stays local (LlamaCpp) for all providers
  • 15 new tests for remote provider functionality
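
As a rough sketch of how these pieces might fit together (the real `RemoteLLM` and `getDefaultLLM` live in src/remote.ts and src/llm.ts; the constructor options, endpoint path, and local fallback shown here are assumptions, not the PR's actual code):

```typescript
// Sketch only: a RemoteLLM hitting an OpenAI-compatible /embeddings endpoint,
// plus a getDefaultLLM factory keyed off QMD_PROVIDER. Details are assumed.
interface LLM {
  embed(texts: string[]): Promise<number[][]>;
}

class RemoteLLM implements LLM {
  constructor(
    private baseUrl: string,
    private apiKey: string | undefined,
    private model: string,
  ) {}

  async embed(texts: string[]): Promise<number[][]> {
    // OpenAI-compatible embeddings request: POST { input, model }.
    const res = await fetch(`${this.baseUrl}/embeddings`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        ...(this.apiKey ? { Authorization: `Bearer ${this.apiKey}` } : {}),
      },
      body: JSON.stringify({ input: texts, model: this.model }),
    });
    const body = (await res.json()) as { data: { embedding: number[] }[] };
    return body.data.map((d) => d.embedding);
  }
}

// Pick a provider from QMD_PROVIDER; null stands in for the local LlamaCpp path.
function getDefaultLLM(env: Record<string, string | undefined>): LLM | null {
  switch (env.QMD_PROVIDER ?? "local") {
    case "voyage":
      return new RemoteLLM(
        "https://api.voyageai.com/v1",
        env.VOYAGE_API_KEY,
        env.VOYAGE_EMBED_MODEL ?? "voyage-4-lite",
      );
    case "openai":
      return new RemoteLLM(
        env.OPENAI_API_BASE ?? "https://api.openai.com/v1",
        env.OPENAI_API_KEY,
        env.OPENAI_EMBED_MODEL ?? "text-embedding-3-small",
      );
    default:
      return null; // local model is handled elsewhere in this sketch
  }
}
```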

Files Changed

  • src/remote.ts - New RemoteLLM implementation
  • src/remote.test.ts - Test suite for remote providers
  • src/llm.ts - Provider selection functions
  • src/store.ts - Use getDefaultLLM for embed/rerank
  • src/qmd.ts - Show provider in status
  • README.md - Documentation for env vars

Environment Variables

| Variable | Default | Description |
| --- | --- | --- |
| `QMD_PROVIDER` | `local` | Provider: `local`, `voyage`, or `openai` |
| `VOYAGE_API_KEY` | - | Voyage AI API key |
| `VOYAGE_EMBED_MODEL` | `voyage-4-lite` | Voyage embedding model |
| `VOYAGE_RERANK_MODEL` | `rerank-2` | Voyage reranking model |
| `OPENAI_API_KEY` | - | OpenAI API key |
| `OPENAI_EMBED_MODEL` | `text-embedding-3-small` | OpenAI embedding model |
| `OPENAI_API_BASE` | `https://api.openai.com/v1` | Base URL for OpenAI-compatible APIs |
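
To make the defaults above concrete, here is a small sketch of how they might resolve into a single config object (only the variable names and defaults come from the table; the config shape and function name are assumptions):

```typescript
// Sketch: resolve the env vars from the table above into one config object.
// Variable names and defaults match the table; everything else is assumed.
type ProviderConfig = {
  provider: "local" | "voyage" | "openai";
  embedModel?: string;
  rerankModel?: string;
  apiBase?: string;
};

function resolveProviderConfig(
  env: Record<string, string | undefined>,
): ProviderConfig {
  switch (env.QMD_PROVIDER ?? "local") {
    case "voyage":
      return {
        provider: "voyage",
        embedModel: env.VOYAGE_EMBED_MODEL ?? "voyage-4-lite",
        rerankModel: env.VOYAGE_RERANK_MODEL ?? "rerank-2",
      };
    case "openai":
      return {
        provider: "openai",
        embedModel: env.OPENAI_EMBED_MODEL ?? "text-embedding-3-small",
        apiBase: env.OPENAI_API_BASE ?? "https://api.openai.com/v1",
      };
    default:
      return { provider: "local" }; // local model needs no remote settings
  }
}
```

In this scheme, pointing the `openai` provider at a self-hosted server (Ollama, vLLM, LM Studio) would just mean setting `OPENAI_API_BASE` to that server's OpenAI-compatible base URL.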

Testing

All 15 new tests pass with actual Voyage API calls:

```
bun test src/remote.test.ts
15 pass, 0 fail
```

Tests skip gracefully when API keys are not set.

@adam91holt adam91holt force-pushed the feat/remote-providers branch from 0b6196d to 15e13f6 Compare January 28, 2026 13:20
@jorgecolonconsulting

This is awesome. It's exactly what I was hoping for. Thanks Adam! @adam91holt

- QMD_PROVIDER env var to select provider: voyage, openai, or local (default)
- Voyage AI support: voyage-4-lite embeddings + rerank-2 reranking
- OpenAI-compatible support: works with OpenAI, Ollama, vLLM, LM Studio, etc.
- Provider info shown in qmd status output
- Add environment variables to --help output
@adam91holt adam91holt force-pushed the feat/remote-providers branch from 15e13f6 to 18badc5 Compare January 29, 2026 00:03
@mcinteerj

mcinteerj commented Feb 8, 2026

I'd love to see this merged! I'm very keen to use qmd with openclaw, but I really like working with Voyage AI's embedding models, so I don't want to move across until this is supported.

Voyage AI has also released voyage-4-nano, which can be run locally, so inference/queries can stay completely local; that feels aligned with the project's intent. The shared embedding space means large remote models can embed the docs while a smaller local model handles queries, although that will need another PR to fully enable.

