Try the SDK with Python 3.14 #4608


Draft: wants to merge 3 commits into master

Conversation

sentrivana (Contributor) commented on Jul 22, 2025

Just an experiment to see how badly things are broken.

codecov bot commented on Jul 22, 2025

❌ 55 Tests Failed:

| Tests completed | Failed | Passed | Skipped |
| --- | --- | --- | --- |
| 25241 | 55 | 25186 | 2468 |
Top 3 failed tests, by shortest run time:
tests.integrations.huggingface_hub.test_huggingface_hub::test_bad_chat_completion
Stack Traces | 0.104s run time
```
.../integrations/huggingface_hub/test_huggingface_hub.py:149: in test_bad_chat_completion
    client.text_generation(prompt="hello")
sentry_sdk/integrations/huggingface_hub.py:87: in new_text_generation
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:83: in new_text_generation
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../huggingface_hub/inference/_client.py:2351: in text_generation
    request_parameters = provider_helper.prepare_request(
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/_common.py:64: in prepare_request
    mapped_model = self._prepare_mapped_model(model)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/hf_inference.py:35: in _prepare_mapped_model
    _check_supported_task(model_id, self.task)
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/hf_inference.py:164: in _check_supported_task
    raise ValueError(
E   ValueError: Model 'mistralai/Mistral-Nemo-Instruct-2407' doesn't support task 'text-generation'. Supported tasks: 'None', got: 'text-generation'
```
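The failure above happens before any request is sent: `_check_supported_task` rejects the call because the model's metadata lookup apparently returned no supported task. A hypothetical sketch of that check, with names simplified (this is not the actual huggingface_hub source, only an illustration of the error message seen in the trace):

```python
# Hypothetical sketch of the supported-task check that produces the
# ValueError in the trace above. Simplified names; not the real
# huggingface_hub implementation.
def check_supported_task(model_id, supported_task, requested_task):
    """Raise if the requested task does not match what the model supports."""
    if supported_task != requested_task:
        raise ValueError(
            f"Model '{model_id}' doesn't support task '{requested_task}'. "
            f"Supported tasks: '{supported_task}', got: '{requested_task}'"
        )

# The traces suggest the model metadata lookup yielded None, so the check
# fails immediately for 'text-generation':
try:
    check_supported_task(
        "mistralai/Mistral-Nemo-Instruct-2407", None, "text-generation"
    )
except ValueError as e:
    print(e)
```

Note that all three failures run under `py3.12` tox environments, so these particular errors look like a model/provider mapping issue rather than a Python 3.14 incompatibility.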
tests.integrations.huggingface_hub.test_huggingface_hub::test_nonstreaming_chat_completion[False-False-True]
Stack Traces | 0.104s run time
```
.../integrations/huggingface_hub/test_huggingface_hub.py:56: in test_nonstreaming_chat_completion
    response = client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:87: in new_text_generation
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:83: in new_text_generation
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../huggingface_hub/inference/_client.py:2351: in text_generation
    request_parameters = provider_helper.prepare_request(
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/_common.py:64: in prepare_request
    mapped_model = self._prepare_mapped_model(model)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/hf_inference.py:35: in _prepare_mapped_model
    _check_supported_task(model_id, self.task)
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/hf_inference.py:164: in _check_supported_task
    raise ValueError(
E   ValueError: Model 'mistralai/Mistral-Nemo-Instruct-2407' doesn't support task 'text-generation'. Supported tasks: 'None', got: 'text-generation'
```
tests.integrations.huggingface_hub.test_huggingface_hub::test_nonstreaming_chat_completion[False-True-False]
Stack Traces | 0.104s run time
```
.../integrations/huggingface_hub/test_huggingface_hub.py:56: in test_nonstreaming_chat_completion
    response = client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:87: in new_text_generation
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:83: in new_text_generation
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../huggingface_hub/inference/_client.py:2351: in text_generation
    request_parameters = provider_helper.prepare_request(
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/_common.py:64: in prepare_request
    mapped_model = self._prepare_mapped_model(model)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/hf_inference.py:35: in _prepare_mapped_model
    _check_supported_task(model_id, self.task)
.tox/py3.12-huggingface_hub-v0.30.2/lib/python3.12.../inference/_providers/hf_inference.py:164: in _check_supported_task
    raise ValueError(
E   ValueError: Model 'mistralai/Mistral-Nemo-Instruct-2407' doesn't support task 'text-generation'. Supported tasks: 'None', got: 'text-generation'
```
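All three traces pass through `sentry_sdk/integrations/huggingface_hub.py` (`new_text_generation`, lines 83 and 87), which is the integration's wrapper around `text_generation`: it calls through to the original function, reports any exception, and re-raises it unchanged. A minimal sketch of that pattern, simplified from what the traces show (not the actual sentry_sdk source; `capture_exception` here stands in for the SDK's reporting hook):

```python
# Minimal sketch of the wrapper pattern visible in the traces: the Sentry
# integration replaces text_generation with a function that calls through,
# captures any exception for reporting, and re-raises it unchanged.
# Simplified; not the actual sentry_sdk implementation.
def wrap_text_generation(f, capture_exception):
    def new_text_generation(*args, **kwargs):
        try:
            res = f(*args, **kwargs)   # corresponds to huggingface_hub.py:83
        except Exception as e:
            capture_exception(e)       # report to Sentry
            raise e from None          # corresponds to huggingface_hub.py:87
        return res
    return new_text_generation
```

Because the wrapper re-raises with `from None`, the original `ValueError` reaches the test unchanged, which is why the underlying provider error, not an SDK error, appears in every failure.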

To view more test analytics, go to the Test Analytics Dashboard
