[BUG] DeepSeek Reasoner rejects system messages through LiteLLM with invalid message format error #430
Comments
If you're using Ollama, it's because LiteLLM includes `format: json` in the request, which currently breaks on Ollama with this model. I made a custom model and it is working.
I'm using litellm directly.
I will compare the LiteLLM output with the DeepSeek docs to see. Last time I checked their repo they were having problems too. I think someone added a special model ID prefix for reasoner models in the dev version.
Error code: 400 - {'error': {'message': 'The last message of deepseek-reasoner must be a user message, or an assistant message with prefix mode on (refer to https://api-docs.deepseek.com/guides/chat_prefix_completion).', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
LiteLLM seems to output the correct format, but perhaps we send a chat history with `role: assistant` as the last message, which is a beta feature of DeepSeek Reasoner (chat prefix completion). Note: I did not try this against the DeepSeek API itself; I just watched the logs from Wireshark on a personal endpoint.
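A client-side guard for the constraint described in that error could look like the sketch below. This is a hypothetical helper, not LiteLLM code; the function name and fallback prompt are made up for illustration:

```python
# Sketch: deepseek-reasoner requires the last message to be a user turn
# (or an assistant turn with prefix mode on). If the history ends with
# an assistant message, append a user turn before sending the request.
def ensure_last_message_is_user(messages, fallback="Please continue."):
    """Return a history whose final message has role 'user'.

    The fallback prompt is a placeholder; callers would supply
    something appropriate to their application.
    """
    if messages and messages[-1]["role"] == "assistant":
        return messages + [{"role": "user", "content": fallback}]
    return messages

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather in Paris?"},
    {"role": "assistant", "content": "Let me check."},
]
fixed = ensure_last_message_is_user(history)
print(fixed[-1]["role"])  # user
```

Whether appending a synthetic user turn is acceptable depends on the application; the alternative is enabling DeepSeek's chat prefix completion beta as the error message suggests.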
Describe the bug
When using LiteLLM to interact with DeepSeek Reasoner's API, the API rejects requests that include system messages with a 400 error. The error message indicates that "The last message of deepseek-reasoner must be a user message, or an assistant message with prefix mode on", suggesting that DeepSeek Reasoner has specific requirements about message ordering and format that aren't being handled correctly by the current LiteLLM integration.
Code to reproduce the error
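The original reproduction snippet was not captured above. A hypothetical LiteLLM call of the shape that triggers this 400 (a chat history ending with an assistant message) might look like this; the prompt contents are illustrative and the API call is guarded behind an environment check:

```python
import os

# A history ending with role "assistant" -- deepseek-reasoner rejects
# this with invalid_request_error unless prefix mode is enabled.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Search for the weather in Paris."},
    {"role": "assistant", "content": "Searching..."},  # last turn: assistant
]

if os.environ.get("DEEPSEEK_API_KEY"):
    import litellm  # deferred import; requires `pip install litellm`
    # Expected to fail with the 400 error quoted in this issue.
    litellm.completion(model="deepseek/deepseek-reasoner", messages=messages)
```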
Error logs (if any)
Expected behavior
Should search for weather and respond based on search results.
Packages version:
Additional context
Note that a simple LiteLLM-only test script like this works fine:
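The working script itself is not shown above; a minimal LiteLLM-only script of the kind described (history ending with a user message, model name per LiteLLM's `deepseek/` provider routing, prompt illustrative) could be:

```python
import os

# Minimal request shape that deepseek-reasoner accepts: the history
# ends with a user message and has no trailing assistant turn.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the weather like today?"},
]

if os.environ.get("DEEPSEEK_API_KEY"):
    import litellm  # deferred import; requires `pip install litellm`
    resp = litellm.completion(
        model="deepseek/deepseek-reasoner", messages=messages
    )
    print(resp.choices[0].message.content)
```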