
databricks configure fails: 400 Bad Request. Message: Error: Model response did not respect the required format #1161

Closed
livshitsa opened this issue Feb 8, 2025 · 8 comments · Fixed by #1162
Labels
bug Something isn't working

Comments

@livshitsa

configure of the databricks provider fails for databricks-meta-llama-3-3-70b-instruct.
here is the error:

Enter a model from that provider:
│ databricks-meta-llama-3-3-70b-instruct

◇ Request failed: Request failed with status: 400 Bad Request. Message: Error: Model response did not respect the required format. Please consider retrying or using a more straightforward prompt.


└ Failed to configure provider: init chat completion request with tool did not succeed.

@salman1993
Collaborator

thanks for reporting this! i can replicate this on my side. will update this thread once we put in a fix.

@salman1993
Collaborator

salman1993 commented Feb 8, 2025

we are running into this error with the databricks model databricks-meta-llama-3-3-70b-instruct, but it works with other models such as gpt-4o.

> curl https://YOUR_HOST/serving-endpoints/databricks-meta-llama-3-3-70b-instruct/invocations \
  -u token:$DATABRICKS_TOKEN \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
        {
            "role": "user",
            "content": "What is the weather like in Paris today?"
        }
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get current temperature for a given location.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "City and country e.g. Bogotá, Colombia"
                        }
                    },
                    "required": [
                        "location"
                    ],
                    "additionalProperties": false
                },
                "strict": true
            }
        }
    ]
}'

{
  "id": "chatcmpl_eef44b42-5ecf-43dc-a9f6-0084718ebb25",
  "object": "chat.completion",
  "created": 1739035458,
  "model": "meta-llama-3.3-70b-instruct-121024",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_52d676b6-b6e8-43e7-867b-396c48e77ae7",
            "type": "function",
            "function": {
              "name": "get_weather",
              "arguments": "{\"location\": \"Paris, France\"}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls",
      "logprobs": null
    }
  ],
  "usage": {
    "prompt_tokens": 719,
    "completion_tokens": 16,
    "total_tokens": 735
  }
}
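For reference, the curl repro above can be sketched in Python using only the standard library (a minimal sketch: the `DATABRICKS_HOST`/`DATABRICKS_TOKEN` environment variables and the `invoke` helper are assumptions mirroring the curl command, not part of goose):

```python
import base64
import json
import os
import urllib.request

# Assumed environment variables, mirroring the curl command above.
HOST = os.environ.get("DATABRICKS_HOST", "https://YOUR_HOST")
TOKEN = os.environ.get("DATABRICKS_TOKEN", "")

# Same payload as the curl invocation: one user message plus one tool definition.
payload = {
    "messages": [
        {"role": "user", "content": "What is the weather like in Paris today?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get current temperature for a given location.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "City and country e.g. Bogotá, Colombia",
                        }
                    },
                    "required": ["location"],
                    "additionalProperties": False,
                },
                "strict": True,
            },
        }
    ],
}


def invoke(model="databricks-meta-llama-3-3-70b-instruct"):
    """POST the tool-calling payload to the model's serving endpoint."""
    # curl's `-u token:$DATABRICKS_TOKEN` is HTTP basic auth.
    creds = base64.b64encode(f"token:{TOKEN}".encode()).decode()
    req = urllib.request.Request(
        f"{HOST}/serving-endpoints/{model}/invocations",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {creds}",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())

# invoke() performs the network call; it is not called here since it needs credentials.
```

Calling `invoke()` against a real workspace should return a chat.completion object with a `get_weather` tool call, as in the JSON response above.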

@salman1993
Collaborator

this PR should fix this: #1162

@salman1993
Collaborator

you can try installing the canary:

curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | CANARY=true bash

@livshitsa
Author

thanks for fixing; configure works now, but using goose session gives the error below. i just entered "what can you do":
│ Configure Providers

◇ Which model provider should we use?
│ Databricks

● DATABRICKS_HOST is set via environment variable

◇ Would you like to save this value to your keyring?
│ Yes

● Saved DATABRICKS_HOST to config file

◇ Enter a model from that provider:
│ databricks-meta-llama-3-3-70b-instruct

◐ Checking your configuration... └ Configuration saved successfully
maccomputer % goose session
starting session | provider: databricks model: databricks-meta-llama-3-3-70b-instruct
logging to /Users/communicator/.config/goose/sessions/KNEzs19j.jsonl

Goose is running! Enter your instructions, or try asking what goose can do.

( O)> what can you do
◒ Nurturing neural nets... 2025-02-08T19:19:05.116652Z ERROR goose::agents::truncate: Error: Request failed: Request failed with status: 400 Bad Request. Message: Bad request: rpc error: code = InvalidArgument desc = Invalid JSON schema - could not process schema

at crates/goose/src/agents/truncate.rs:279

Ran into this error: Request failed: Request failed with status: 400 Bad Request. Message: Bad request: rpc error: code = InvalidArgument desc = Invalid JSON schema - could not process schema.

Please retry if you think this is a transient or recoverable error.

@salman1993 salman1993 reopened this Feb 8, 2025
@salman1993
Collaborator

what extensions are you using?

@livshitsa
Author

> what extensions are you using?

just dev tools

@salman1993
Collaborator

i could replicate this. seems like an error in the Databricks model's tool calling; we're trying to debug it with Databricks.

@lily-de lily-de closed this as completed Feb 12, 2025