
[BUG] LiteLLMModel, ModuleNotFoundError: No module named 'cgi' #441

Open
neoneye opened this issue Jan 30, 2025 · 3 comments
Labels
bug Something isn't working

Comments

@neoneye commented Jan 30, 2025

Describe the bug

There is already an open litellm issue about this.

LiteLLMModel depends on litellm, which imports the stdlib cgi module.

The cgi module was removed in Python 3.13 (PEP 594), causing litellm to break.
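
For reference, the import that fails inside litellm is from cgi import parse_header (see the traceback below). Here is a minimal compatibility sketch, assuming only the stdlib email package (the replacement the Python docs point to for cgi.parse_header); it is not litellm's actual fix, just an illustration of what the removed call did:

try:
    from cgi import parse_header  # removed from the stdlib in Python 3.13 (PEP 594)
except ModuleNotFoundError:
    from email.message import Message

    def parse_header(line):
        # Mimic cgi.parse_header: return (main value, dict of parameters).
        msg = Message()
        msg["content-type"] = line
        params = msg.get_params()  # first entry is the main value itself
        return params[0][0], dict(params[1:])

print(parse_header("text/html; charset=utf-8"))
# ('text/html', {'charset': 'utf-8'})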

Code to reproduce the error

from smolagents import LiteLLMModel

model = LiteLLMModel(
    model_id="ollama_chat/llama3.1:latest",
    # api_key="ollama",
    # api_base="http://localhost:11434",
)

messages = [{"role": "user", "content": "What is the capital of Mexico?"}]
response = model(messages, temperature=0.8)
print(response)

Error logs (if any)

(venv) PROMPT> python --version
Python 3.13.1

(venv) PROMPT> python -m src.proof_of_concepts.run_chat_ollama
/path/to/venv/lib/python3.13/site-packages/pydantic/_internal/_config.py:345: UserWarning: Valid config keys have changed in V2:
* 'fields' has been removed
  warnings.warn(message, UserWarning)
Traceback (most recent call last):
  File "/path/to/venv/lib/python3.13/site-packages/smolagents/models.py", line 642, in __init__
    import litellm
  File "/path/to/venv/lib/python3.13/site-packages/litellm/__init__.py", line 1061, in <module>
    from .llms.bedrock.chat.converse_transformation import AmazonConverseConfig
  File "/path/to/venv/lib/python3.13/site-packages/litellm/llms/bedrock/chat/__init__.py", line 1, in <module>
    from .converse_handler import BedrockConverseLLM
  File "/path/to/venv/lib/python3.13/site-packages/litellm/llms/bedrock/chat/converse_handler.py", line 20, in <module>
    from .invoke_handler import AWSEventStreamDecoder, MockResponseIterator, make_call
  File "/path/to/venv/lib/python3.13/site-packages/litellm/llms/bedrock/chat/invoke_handler.py", line 32, in <module>
    from litellm.litellm_core_utils.prompt_templates.factory import (
    ...<7 lines>...
    )
  File "/path/to/venv/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 2156, in <module>
    from cgi import parse_header
ModuleNotFoundError: No module named 'cgi'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/path/to/src/proof_of_concepts/run_chat_ollama.py", line 9, in <module>
    model = LiteLLMModel(
        model_id="ollama_chat/llama3.1:latest",
        # api_key="ollama",
        # api_base="http://localhost:11434",
    )
  File "/path/to/venv/lib/python3.13/site-packages/smolagents/models.py", line 644, in __init__
    raise ModuleNotFoundError(
        "Please install 'litellm' extra to use LiteLLMModel: `pip install 'smolagents[litellm]'`"
    )
ModuleNotFoundError: Please install 'litellm' extra to use LiteLLMModel: `pip install 'smolagents[litellm]'`

Expected behavior

That Ollama gets invoked and prints something like the following.

PROMPT> python -m src.proof_of_concepts.run_chat_ollama
ChatMessage(role='assistant', content='The capital of Mexico is **Mexico City** (Ciudad de México in Spanish). It is the largest city in Mexico and one of the most populous cities in the world. Mexico City serves as the political, cultural, and economic center of the country.', tool_calls=None)

Package versions:

smolagents==1.6.0
litellm==1.59.10
@neoneye neoneye added the bug Something isn't working label Jan 30, 2025
@neoneye neoneye changed the title [BUG] ModuleNotFoundError: No module named 'cgi' [BUG] LiteLLMModel, ModuleNotFoundError: No module named 'cgi' Jan 30, 2025
@touseefahmed96 (Contributor)

If you want to use Ollama models, try #368

from smolagents import OllamaModel

model = OllamaModel(
    model_id="llama3.1:latest",
    # api_key="ollama",
    # api_base="http://localhost:11434",
)

messages = [{"role": "user", "content": "What is the capital of Mexico?"}]
response = model(messages, temperature=0.8)
print(response)

output:

ChatMessage(role='assistant', content='The capital of Mexico is Mexico City (Ciudad de México in Spanish).', tool_calls=None)

@touseefahmed96 (Contributor)

Or the simplest solution is to install the cgi backport manually: pip install legacy-cgi
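
A quick sanity check after installing the backport (assuming legacy-cgi provides a drop-in cgi module, which is what the package is for):

# pip install legacy-cgi
# Then the import litellm needs should resolve again:
from cgi import parse_header

print(parse_header("text/html; charset=utf-8"))
# expected: ('text/html', {'charset': 'utf-8'})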

@albertvillanova (Member) commented Jan 31, 2025

Important context:

  • This issue only affects Python >= 3.13.

I would say this is a bug in litellm:

Therefore, in my opinion:
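
Until litellm drops its cgi import, one caller-side workaround is to fail fast with an actionable message instead of the misleading "install litellm" error; this is only a sketch for the reproduction script above, not an official fix:

import importlib.util
import sys

# On Python >= 3.13 the stdlib cgi module is gone (PEP 594); without it,
# constructing smolagents' LiteLLMModel hides the real cause behind a
# "please install litellm" message, so check up front.
if sys.version_info >= (3, 13) and importlib.util.find_spec("cgi") is None:
    sys.exit(
        "litellm still imports the removed stdlib cgi module on this Python; "
        "run pip install legacy-cgi or use Python <= 3.12."
    )

from smolagents import LiteLLMModel  # with cgi available, constructing LiteLLMModel no longer hits this failure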
