[BUG] LiteLLMModel, ModuleNotFoundError: No module named 'cgi' #441
Labels: bug (Something isn't working)

Comments
neoneye changed the title from [BUG] ModuleNotFoundError: No module named 'cgi' to [BUG] LiteLLMModel, ModuleNotFoundError: No module named 'cgi' on Jan 30, 2025
If you want to use Ollama models, try #368.
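A hedged sketch of using Ollama without going through litellm, in case that is what #368 suggests (an assumption): point smolagents' OpenAIServerModel at Ollama's OpenAI-compatible endpoint, so litellm, and thus the removed cgi module, is never imported.

```python
# Sketch only: whether this matches #368 is an assumption.
# OpenAIServerModel talks to any OpenAI-compatible endpoint, so it
# avoids importing litellm (and therefore the removed 'cgi' module).
from smolagents import CodeAgent, OpenAIServerModel

model = OpenAIServerModel(
    model_id="llama3.2",                   # illustrative model name
    api_base="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # Ollama ignores the key; any string works
)
agent = CodeAgent(tools=[], model=model)
agent.run("What is 2 + 2?")
```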
Or the simplest solution is to install cgi manually.
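A minimal sketch of that manual install, assuming the third-party legacy-cgi package on PyPI, which restores the removed cgi module on Python 3.13+:

```bash
# Backport of the stdlib 'cgi' module removed in Python 3.13 (PEP 594)
pip install legacy-cgi
```

With the backport installed, import cgi resolves again and litellm imports normally.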
Important context: I would say this is a bug of litellm, not of this project. Therefore, in my opinion, the fix belongs upstream in litellm.
Describe the bug
There is already a litellm issue about this problem. The LiteLLMModel depends on litellm, which in turn depends on the standard-library cgi module. The cgi module has recently been removed from Python (PEP 594), causing litellm to break.

Code to reproduce the error
Error logs (if any)
Expected behavior
That the Ollama model gets invoked and prints out something like the following.
Package versions: