@pokemon9757
Made this provider code compatible with the README instructions.
Quote README: "To use local LLM, comment out OPENAI_KEY and instead uncomment OPENAI_ENDPOINT and OPENAI_MODEL"
The previous version would not work if the user leaves the key empty (`OPENAI_KEY=""`) and attempts to use a local model (e.g., one served by LM Studio).
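A minimal sketch of the selection logic this fix implies, assuming a Python provider and hypothetical names (`resolve_provider_config` is illustrative, not the actual function in the repo): an empty `OPENAI_KEY` is treated the same as an unset one, so the local `OPENAI_ENDPOINT`/`OPENAI_MODEL` pair can take over.

```python
import os

def resolve_provider_config():
    # Hypothetical sketch: an empty OPENAI_KEY ("") is treated like an
    # unset one, matching the README's local-LLM instructions.
    api_key = os.environ.get("OPENAI_KEY", "").strip()
    if api_key:
        # Hosted OpenAI: key present and non-empty.
        return {"mode": "openai", "api_key": api_key}

    endpoint = os.environ.get("OPENAI_ENDPOINT", "").strip()
    model = os.environ.get("OPENAI_MODEL", "").strip()
    if endpoint and model:
        # Local LLM (e.g., LM Studio) exposing an OpenAI-compatible API.
        return {"mode": "local", "endpoint": endpoint, "model": model}

    raise RuntimeError(
        "Set OPENAI_KEY, or set OPENAI_ENDPOINT and OPENAI_MODEL for a local model."
    )
```

With `OPENAI_KEY=""` in the environment, the old behavior would take the hosted-OpenAI branch and fail; the sketch above instead falls through to the local endpoint.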