
[Bug]: litellm throwing AttributeError: __annotations__ and standard_logging_object is None #9424

Closed
jamesbraza opened this issue Mar 20, 2025 · 4 comments · Fixed by #9455
Labels
bug Something isn't working

Comments

@jamesbraza
Contributor

What happened?

Trying to pull in the latest litellm in Future-House/paper-qa#914, I get the errors below.

Relevant log output

ERROR    LiteLLM:litellm_logging.py:3525 Error creating standard logging object - __annotations__
Traceback (most recent call last):
  File "/home/runner/work/paper-qa/paper-qa/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 3507, in get_standard_logging_object_payload
    model_parameters=ModelParamHelper.get_standard_logging_model_parameters(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/work/paper-qa/paper-qa/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 28, in get_standard_logging_model_parameters
    ModelParamHelper._get_relevant_args_to_use_for_logging()
  File "/home/runner/work/paper-qa/paper-qa/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 45, in _get_relevant_args_to_use_for_logging
    all_openai_llm_api_params = ModelParamHelper._get_all_llm_api_params()
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/work/paper-qa/paper-qa/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 65, in _get_all_llm_api_params
    ModelParamHelper._get_litellm_supported_transcription_kwargs()
  File "/home/runner/work/paper-qa/paper-qa/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 126, in _get_litellm_supported_transcription_kwargs
    return set(TranscriptionCreateParams.__annotations__.keys())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.local/share/uv/python/cpython-3.11.11-linux-x86_64-gnu/lib/python3.11/typing.py", line 1318, in __getattr__
    raise AttributeError(attr)
AttributeError: __annotations__
ERROR    LiteLLM Router:router.py:3814 litellm.router.Router::deployment_callback_on_success(): Exception occured - standard_logging_object is None
Traceback (most recent call last):
  File "/home/runner/work/paper-qa/paper-qa/.venv/lib/python3.11/site-packages/litellm/router.py", line 3754, in deployment_callback_on_success
    raise ValueError("standard_logging_object is None")
ValueError: standard_logging_object is None

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.63.12

Twitter / LinkedIn details

No response

@jamesbraza jamesbraza added the bug Something isn't working label Mar 20, 2025
@ishaan-jaff
Contributor

This gets raised on the latest versions of the openai Python package. Could we get help resolving this with a contributor PR?

We need a simple way to get the supported params for each OpenAI call type.

@ishaan-jaff
Contributor

@jamesbraza if you pin to openai==1.61.0 this will not be an issue (while we are resolving this)
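For pip users, the suggested pin looks like this (adjust the syntax for your own package manager or lockfile):

```shell
pip install "openai==1.61.0"
```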

@jk-vtp-one

Ran into this same error:

LiteLLM:ERROR: litellm_logging.py:3525 - Error creating standard logging object - __annotations__
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 3507, in get_standard_logging_object_payload
    model_parameters=ModelParamHelper.get_standard_logging_model_parameters(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 28, in get_standard_logging_model_parameters
    ModelParamHelper._get_relevant_args_to_use_for_logging()
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 45, in _get_relevant_args_to_use_for_logging
    all_openai_llm_api_params = ModelParamHelper._get_all_llm_api_params()
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 65, in _get_all_llm_api_params
    ModelParamHelper._get_litellm_supported_transcription_kwargs()
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 126, in _get_litellm_supported_transcription_kwargs
    return set(TranscriptionCreateParams.__annotations__.keys())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/typing.py", line 1318, in __getattr__
    raise AttributeError(attr)
AttributeError: __annotations__

The issue is caused by this change in OpenAI:
openai/openai-python@2b4bc75#diff-a404bd516635211261f865f8a8a478d3254d2720a914d715337b6113b41acfc6R20

https://github.com/openai/openai-python/blob/main/src/openai/types/audio/transcription_create_params.py

This refactored TranscriptionCreateParams into three classes:

  • TranscriptionCreateParamsBase
  • TranscriptionCreateParamsNonStreaming
  • TranscriptionCreateParamsStreaming

The TranscriptionCreateParams name still exists, but it is now a type alias for a Union:

TranscriptionCreateParams = Union[TranscriptionCreateParamsNonStreaming, TranscriptionCreateParamsStreaming]

A Union alias has no __annotations__ attribute, which is what causes the error.
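The failure can be reproduced without openai installed at all: accessing __annotations__ on a subscripted Union raises AttributeError via typing's __getattr__. A minimal sketch (the TypedDict names below are stand-ins, not the real openai classes):

```python
from typing import Literal, Optional, TypedDict, Union

class ParamsNonStreaming(TypedDict, total=False):
    # Stand-in for TranscriptionCreateParamsNonStreaming
    stream: Optional[Literal[False]]

class ParamsStreaming(TypedDict):
    # Stand-in for TranscriptionCreateParamsStreaming
    stream: Literal[True]

# The alias is a Union, not a class, so it has no __annotations__
Params = Union[ParamsNonStreaming, ParamsStreaming]

try:
    Params.__annotations__
except AttributeError as exc:
    print(f"AttributeError: {exc}")  # AttributeError: __annotations__
```

This mirrors the traceback above: typing.py's __getattr__ raises AttributeError("__annotations__") because dunder attributes are not forwarded to the Union's members.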

I'm not sure how this should be fixed since I don't know what this Transcription class is being used for on either the LiteLLM side or the OpenAI side.

Changing the LiteLLM function to use the new base class will stop the error, but not fix the functionality.
https://github.com/BerriAI/litellm/blob/main/litellm/litellm_core_utils/model_param_helper.py#L126

return set(TranscriptionCreateParamsBase.__annotations__.keys())

The Streaming and NonStreaming classes both include an additional typed stream parameter that isn't on the base class:

class TranscriptionCreateParamsNonStreaming(TranscriptionCreateParamsBase, total=False):
    stream: Optional[Literal[False]]
    """
    If set to true, the model response data will be streamed to the client as it is
    generated using
    [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format).
    See the
    [Streaming section of the Speech-to-Text guide](https://platform.openai.com/docs/guides/speech-to-text?lang=curl#streaming-transcriptions)
    for more information.

    Note: Streaming is not supported for the `whisper-1` model and will be ignored.
    """


class TranscriptionCreateParamsStreaming(TranscriptionCreateParamsBase):
    stream: Required[Literal[True]]
    """
    If set to true, the model response data will be streamed to the client as it is
    generated using
    [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format).
    See the
    [Streaming section of the Speech-to-Text guide](https://platform.openai.com/docs/guides/speech-to-text?lang=curl#streaming-transcriptions)
    for more information.

    Note: Streaming is not supported for the `whisper-1` model and will be ignored.
    """

I don't know why they made the two classes like this, but stream is typed differently on each of them.

If LiteLLM is using these classes for Transcription, it will need to incorporate these new classes elsewhere.
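One way to make the key collection robust to this kind of change is to detect a Union and merge the annotation keys of its members. This is only a sketch under my own assumptions (the helper name is made up, and the TypedDicts are stand-ins; it is not the actual fix shipped in #9455):

```python
from typing import Literal, Optional, TypedDict, Union, get_args, get_origin

def typed_dict_keys(params_type) -> set:
    """Collect annotation keys from a TypedDict class, or from every
    member of a Union of TypedDicts (as in newer openai releases)."""
    if get_origin(params_type) is Union:
        keys: set = set()
        for member in get_args(params_type):
            keys |= set(member.__annotations__.keys())
        return keys
    return set(params_type.__annotations__.keys())

# Stand-ins for the openai param classes:
class ParamsNonStreaming(TypedDict, total=False):
    model: str
    stream: Optional[Literal[False]]

class ParamsStreaming(TypedDict):
    model: str
    stream: Literal[True]

Params = Union[ParamsNonStreaming, ParamsStreaming]
print(sorted(typed_dict_keys(Params)))  # ['model', 'stream']
```

Because the helper falls back to reading __annotations__ directly, it would keep working against older openai versions where TranscriptionCreateParams was still a plain TypedDict.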

@ishaan-jaff
Contributor

Hi @jk-vtp-one, can you help with a PR to fix the immediate issue?

The point of that method was to get the optional params for a transcription request. We can modify it to return the base params.

peteski22 added a commit to mozilla-ai/lumigator that referenced this issue Mar 25, 2025
starpit added a commit to starpit/prompt-declaration-language that referenced this issue Mar 25, 2025
This is until we can pick up a litellm fix.

BerriAI/litellm#9424

Signed-off-by: Nick Mitchell <[email protected]>
starpit added a commit to IBM/prompt-declaration-language that referenced this issue Mar 25, 2025
This is until we can pick up a litellm fix.

BerriAI/litellm#9424

Signed-off-by: Nick Mitchell <[email protected]>