[Bug]: litellm throwing AttributeError: __annotations__ and standard_logging_object is None #9424
Comments
This gets raised on the latest versions of the openai Python package. Could we get help resolving this with a contributor PR? We need a simple way to get the supported params for each OpenAI call type.
@jamesbraza if you pin to openai==1.61.0 this will not be an issue (while we are resolving this).
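For anyone applying the workaround above, the pin can be expressed as a line in a requirements file (assuming a pip-style requirements.txt; adjust for your dependency manager):

```
openai==1.61.0
```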
Ran into this same error:

```
LiteLLM:ERROR: litellm_logging.py:3525 - Error creating standard logging object - __annotations__
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 3507, in get_standard_logging_object_payload
    model_parameters=ModelParamHelper.get_standard_logging_model_parameters(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 28, in get_standard_logging_model_parameters
    ModelParamHelper._get_relevant_args_to_use_for_logging()
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 45, in _get_relevant_args_to_use_for_logging
    all_openai_llm_api_params = ModelParamHelper._get_all_llm_api_params()
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 65, in _get_all_llm_api_params
    ModelParamHelper._get_litellm_supported_transcription_kwargs()
  File "/usr/local/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 126, in _get_litellm_supported_transcription_kwargs
    return set(TranscriptionCreateParams.__annotations__.keys())
  File "/usr/local/lib/python3.11/typing.py", line 1318, in __getattr__
    raise AttributeError(attr)
AttributeError: __annotations__
```

The issue is caused by a change in OpenAI that refactored `TranscriptionCreateParams` into three classes.
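The failure mode can be reproduced without openai or litellm at all. Below is a minimal sketch (the class names are hypothetical stand-ins, not the real OpenAI types): accessing `__annotations__` on a `typing.Union` alias raises `AttributeError`, because `typing`'s `__getattr__` does not forward dunder attributes, which matches the `typing.py` frame in the traceback above (observed on Python 3.11).

```python
from typing import Union


class NonStreamingParams:  # hypothetical stand-in
    stream: bool


class StreamingParams:  # hypothetical stand-in
    stream: bool


# A Union alias, mirroring the shape of the refactored TranscriptionCreateParams
ParamsUnion = Union[NonStreamingParams, StreamingParams]

# The Union object itself has no __annotations__; typing raises AttributeError
try:
    ParamsUnion.__annotations__
except AttributeError as exc:
    print(f"AttributeError: {exc}")
```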
The `TranscriptionCreateParams` object is still there; it is now a type Union:

```python
TranscriptionCreateParams = Union[TranscriptionCreateParamsNonStreaming, TranscriptionCreateParamsStreaming]
```

This is what causes the error. I'm not sure how this should be fixed, since I don't know what this Transcription class is being used for on either the LiteLLM side or the OpenAI side. Changing the LiteLLM function to use the new base class will stop the error, but not fix the functionality:

```python
return set(TranscriptionCreateParamsBase.__annotations__.keys())
```

The Streaming and NonStreaming classes both include an additional typed parameter that isn't on the base class:

```python
class TranscriptionCreateParamsNonStreaming(TranscriptionCreateParamsBase, total=False):
    stream: Optional[Literal[False]]
    """
    If set to true, the model response data will be streamed to the client as it is
    generated using
    [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format).
    See the
    [Streaming section of the Speech-to-Text guide](https://platform.openai.com/docs/guides/speech-to-text?lang=curl#streaming-transcriptions)
    for more information.

    Note: Streaming is not supported for the `whisper-1` model and will be ignored.
    """


class TranscriptionCreateParamsStreaming(TranscriptionCreateParamsBase):
    stream: Required[Literal[True]]
    """
    If set to true, the model response data will be streamed to the client as it is
    generated using
    [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format).
    See the
    [Streaming section of the Speech-to-Text guide](https://platform.openai.com/docs/guides/speech-to-text?lang=curl#streaming-transcriptions)
    for more information.

    Note: Streaming is not supported for the `whisper-1` model and will be ignored.
    """
```

I don't know why they made the two classes like this, but `stream` is typed differently on each of them. If LiteLLM is using these classes for Transcription, it will need to incorporate these new classes elsewhere.
Hi @jk-vtp-one, can you help with a PR to fix the immediate issue? The point of that method was to get the optional params for a transcription request. We can modify it to return the base params.
Pin openai=1.61.0 BerriAI/litellm#9424 Signed-off-by: Nick Mitchell <[email protected]>
This is until we can pick up a litellm fix. BerriAI/litellm#9424 Signed-off-by: Nick Mitchell <[email protected]>
What happened?

Trying to pull in the latest litellm in Future-House/paper-qa#914, I get blown up.

Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.63.12
Twitter / LinkedIn details
No response