
[Bug]: allowed_openai_params in config.yaml not working #15845

@noahpodgurski

Description


What happened?

Setting allowed_openai_params: ["tools"] under

- model_name: model
  litellm_params:
    allowed_openai_params: ["tools"]

is not respected in requests. The only way I could get it working was attaching "allowed_openai_params": ["tools"] to the body of my requests; this should be allowed automatically when the config option is set.
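For reference, here is a rough sketch of the full model entry I'm working from; the together_ai model path and the api_key reference are placeholders, not my real values:

model_list:
  - model_name: model
    litellm_params:
      model: together_ai/my-custom-model        # placeholder for the custom model path
      api_key: os.environ/TOGETHERAI_API_KEY    # placeholder
      allowed_openai_params: ["tools"]

The per-request workaround that does work looks roughly like this (the tool definition is just a dummy example):

{
  "model": "model",
  "messages": [{"role": "user", "content": "What's the weather?"}],
  "tools": [
    {"type": "function", "function": {"name": "get_weather", "parameters": {"type": "object", "properties": {}}}}
  ],
  "allowed_openai_params": ["tools"]
}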

Relevant log output

litellm.UnsupportedParamsError: together_ai does not support parameters: ['tools'], for model=XYZ To drop these, set `litellm.drop_params=True` or for proxy:\n\n`litellm_settings:\n drop_params: true`\n. \n If you want to use these params dynamically send allowed_openai_params=['tools'] in your request..

The model I'm testing with is a TogetherAI model with a custom name. Could this be related to the bug? Do custom models need to be explicitly given support somewhere else in the config?

Are you a ML Ops Team?

No

What LiteLLM version are you on?

v1.77.3-stable

Twitter / LinkedIn details

No response
