[Bug] Ollama integration not working. #5610
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Hey @ZaEvab55555! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together! The "fetch failed" TypeError you're encountering with the Ollama integration in Lobe Chat on Windows 11 using Docker can be resolved by enabling client request mode in LobeChat. When this mode is disabled, the Ollama model configuration fails with a "fetch failed" TypeError. Make sure client request mode is enabled in your LobeChat configuration to fix the issue [1].
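A common reason client request mode matters here: when the LobeChat server runs inside a Docker container, `localhost` resolves to the container itself rather than the Windows host where Ollama listens on port 11434, so server-side fetches fail. The sketch below illustrates that address distinction; the helper name and the `in_docker` flag are illustrative assumptions, not LobeChat's actual code.

```python
def ollama_base_url(in_docker: bool) -> str:
    """Pick the base URL a client should use to reach Ollama (default port 11434)."""
    # Inside a Docker container, "localhost" resolves to the container itself,
    # so the host machine must be addressed via Docker's special hostname.
    host = "host.docker.internal" if in_docker else "127.0.0.1"
    return f"http://{host}:11434"

print(ollama_base_url(True))   # URL a containerized LobeChat server would need
print(ollama_base_url(False))  # URL a natively-run client (or the browser) would use
```

Enabling client request mode sidesteps the container-networking problem entirely, because the request to Ollama is then made from the user's browser on the host rather than from inside the container.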
@dosu what about this error?
The error message indicates that function calling is not enabled for the specified Google Generative AI model. To enable function calling for Google Generative AI models in Lobe Chat, ensure that the model's description includes 'function calling' or 'tools'; the check is performed against the model's description text.
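The capability check described above can be sketched as a simple keyword scan over the model description. This is a hypothetical stand-in for illustration, not LobeChat's actual implementation:

```python
def supports_function_calling(description: str) -> bool:
    """Return True if a model description advertises tool/function-calling support."""
    # Case-insensitive keyword match on the description text.
    desc = description.lower()
    return "function calling" in desc or "tools" in desc

print(supports_function_calling("Gemini model with function calling"))  # True
print(supports_function_calling("Text-only chat model"))                # False
```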
This issue is closed. If you have any questions, you can comment and reply.
🎉 This issue has been resolved in version 1.49.11 🎉

The release is available on:

Your semantic-release bot 📦🚀
📦 Environment
Docker
📌 Version
Latest Docker, Windows 11
💻 Operating System
Windows
🌐 Browser
Other
🐛 Bug Description
When I try to use Lobe Chat with Ollama, it returns:

```json
{
  "error": {
    "message": "fetch failed",
    "name": "TypeError"
  },
  "provider": "ollama"
}
```
Why does it do that?
📷 Recurrence Steps
I start ollama and start the lobe chat server. Then I navigate to the local server and try to chat.
🚦 Expected Behavior
I thought it would respond to my prompt of "What is lobechat?"
📝 Additional Information
No response