
[Bug] Local OLLAMA connection failure #5634

Open
tcyuan1017 opened this issue Jan 30, 2025 · 4 comments
Labels
🐛 Bug Something isn't working | 缺陷 ollama Relative to Ollama Provider and ollama models

Comments

@tcyuan1017

📦 Environment

Docker

📌 Version

20250130

💻 Operating System

Ubuntu

🌐 Browser

Chrome

🐛 Bug Description

Image
Image
No matter how I adjust the API proxy address, the connectivity check always fails; it keeps showing "host": "http://127.0.0.1:11434",
and I cannot connect to the local OLLAMA.

📷 Recurrence Steps

No response

🚦 Expected Behavior

No response

📝 Additional Information

No response

@tcyuan1017 tcyuan1017 added the 🐛 Bug Something isn't working | 缺陷 label Jan 30, 2025
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Jan 30, 2025
@lobehubbot
Member

👀 @tcyuan1017

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@dosubot dosubot bot added the ollama Relative to Ollama Provider and ollama models label Jan 30, 2025

dosubot bot commented Jan 30, 2025

@tcyuan1017 Hello! I'm Dosu, a bot that can help you solve bugs, answer questions, and become a contributor. I'll be helping you while you wait for a human maintainer.

I found a similar issue that was resolved, which might help you with the connectivity problem you're experiencing with the Ollama service in LobeChat.

The solution involves configuring Ollama for cross-origin access by setting the OLLAMA_ORIGINS environment variable to allow access from any origin. Here are the steps you can follow:

On Linux:

  1. Edit the systemd service:
    sudo systemctl edit ollama.service
  2. Add the following under [Service]:
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    Environment="OLLAMA_ORIGINS=*"
  3. Save and exit.
  4. Reload systemd and restart Ollama:
    sudo systemctl daemon-reload
    sudo systemctl restart ollama
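The override from the steps above can equivalently be written as a drop-in file (a sketch; the path assumes systemd's default drop-in layout for the ollama.service unit):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
```

After saving the file, run `sudo systemctl daemon-reload` followed by `sudo systemctl restart ollama` to apply it.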

Using Docker:

  1. Run the Docker container with the environment variable:
    docker run -d --gpus=all -v ollama:/root/.ollama -e OLLAMA_ORIGINS="*" -p 11434:11434 --name ollama ollama/ollama

Additionally, ensure that the DEFAULT_BASE_URL in your LobeChat configuration is set to http://127.0.0.1:11434, as this is the default host used by the Ollama service. If you have customized the base URL, make sure it matches the configuration in your Ollama service setup [1].
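Since the reporter runs both LobeChat and Ollama under Docker Compose, note that from inside the LobeChat container, 127.0.0.1 refers to that container itself, not the host. A minimal compose sketch follows; the service names are illustrative, and OLLAMA_PROXY_URL is assumed to be the environment variable LobeChat uses to override the Ollama endpoint (verify the exact name against the current LobeChat docs):

```yaml
# Sketch: both services share the default compose network, so the
# hostname "ollama" resolves to the Ollama container by DNS.
services:
  ollama:
    image: ollama/ollama
    environment:
      - OLLAMA_ORIGINS=*
    ports:
      - "11434:11434"
  lobe-chat:
    image: lobehub/lobe-chat
    environment:
      # 127.0.0.1 inside this container is NOT the host machine;
      # point at the ollama service by its compose service name.
      - OLLAMA_PROXY_URL=http://ollama:11434
    ports:
      - "3210:3210"
```

With this layout the connectivity check targets http://ollama:11434 instead of the container's own loopback address.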


@tcyuan1017
Author

Dosu,
I'm running the Ollama service with Docker Compose,
and I've already added the OLLAMA_ORIGINS="*" parameter, as shown in the image.
Image
The key question is: how do I change the URL used by the connectivity check?
The API proxy address setting doesn't seem to take effect.

@lobehubbot
Member

Bot detected the issue body's language is not English, translate it automatically. 👯👭🏻🧑‍🤝‍🧑👫🧑🏿‍🤝‍🧑🏻👩🏾‍🤝‍👨🏿👬🏿


Dosu,
I run the Ollama service with Docker Compose,
and I have also added the OLLAMA_ORIGINS="*" parameter, as shown in the image.
Image
The point is, how do I adjust the URL used by the connectivity check?
It seems that the API proxy address setting does not take effect.

Projects
Status: Roadmap - Chat 1.x
Development

No branches or pull requests

2 participants