
Streamlit time runout error #35

Open
HrithikgowdaD opened this issue Dec 11, 2024 · 1 comment
@HrithikgowdaD

2024-12-11 13:15:24.381 WARNING streamlit.runtime.scriptrunner_utils.script_run_context: Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.240 WARNING streamlit.runtime.scriptrunner_utils.script_run_context: Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.258
Warning: to view this Streamlit app on a browser, run it with the following
command:

streamlit run d:\Project\Local-Multimodal-AI-Chat-main\app.py [ARGUMENTS]

2024-12-11 13:16:02.260 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.262 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.262 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.263 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.263 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.264 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.271 Session state does not function when running a script without streamlit run
2024-12-11 13:16:02.272 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.274 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.274 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.274 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.275 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.276 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.281 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.291 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.292 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.291 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
2024-12-11 13:16:02.292 Thread 'MainThread': missing ScriptRunContext! This warning can be ignored when running in bare mode.
Traceback (most recent call last):
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\connection.py", line 199, in _new_conn
sock = connection.create_connection(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
raise err
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\util\connection.py", line 73, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\connectionpool.py", line 789, in urlopen
response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\connectionpool.py", line 495, in _make_request
conn.request(
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\connection.py", line 441, in request
self.endheaders()
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.2544.0_x64__qbz5n2kfra8p0\Lib\http\client.py", line 1298, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.2544.0_x64__qbz5n2kfra8p0\Lib\http\client.py", line 1058, in _send_output
self.send(msg)
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.2544.0_x64__qbz5n2kfra8p0\Lib\http\client.py", line 996, in send
self.connect()
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\connection.py", line 279, in connect
self.sock = self._new_conn()
^^^^^^^^^^^^^^^^
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\connection.py", line 208, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPConnection object at 0x0000018A13092110>, 'Connection to host.docker.internal timed out. (connect timeout=None)')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\requests\adapters.py", line 667, in send
resp = conn.urlopen(
^^^^^^^^^^^^^
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\connectionpool.py", line 843, in urlopen
retries = retries.increment(
^^^^^^^^^^^^^^^^^^
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\urllib3\util\retry.py", line 519, in increment
raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/tags (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x0000018A13092110>, 'Connection to host.docker.internal timed out. (connect timeout=None)'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "d:\Project\Local-Multimodal-AI-Chat-main\app.py", line 179, in <module>
main()
File "d:\Project\Local-Multimodal-AI-Chat-main\app.py", line 79, in main
st.session_state.model_options = list_model_options()
^^^^^^^^^^^^^^^^^^^^
File "d:\Project\Local-Multimodal-AI-Chat-main\app.py", line 56, in list_model_options
ollama_options = list_ollama_models()
^^^^^^^^^^^^^^^^^^^^
File "d:\Project\Local-Multimodal-AI-Chat-main\utils.py", line 115, in list_ollama_models
json_response = requests.get(url = config["ollama"]["base_url"] + "/api/tags").json()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\requests\api.py", line 73, in get
return request("get", url, params=params, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\requests\api.py", line 59, in request
return session.request(method=method, url=url, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Project\Local-Multimodal-AI-Chat-main\chat_venv\Lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
requests.exceptions.ConnectionError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/tags (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x0000018A13092110>, 'Connection to host.docker.internal timed out. (connect timeout=None)'))

So, could you let me know how to fix this error?
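For context: the traceback shows `list_ollama_models` in `utils.py` calling `requests.get` on `base_url + "/api/tags"` with no timeout, so an unreachable `host.docker.internal` hangs until the OS-level connect times out. A minimal sketch (a hypothetical rewrite, not the repo's actual code; the default `base_url` is an assumption) of the same call with an explicit timeout and a graceful fallback:

```python
import requests


def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of locally available Ollama models.

    Uses an explicit timeout so an unreachable host (for example a stale
    host.docker.internal base_url) fails fast instead of hanging.
    """
    try:
        response = requests.get(base_url + "/api/tags", timeout=5)
        response.raise_for_status()
        # Ollama's /api/tags response has a top-level "models" list.
        return [model["name"] for model in response.json().get("models", [])]
    except requests.exceptions.RequestException:
        # Host unreachable or bad response -> show no Ollama models
        # rather than crashing the Streamlit app at startup.
        return []
```

With this shape, a wrong `base_url` degrades to an empty model list instead of the `ConnectTimeoutError` traceback above.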

@Leon-Sander
Owner

Did you install Ollama as a Docker container?
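The question matters because `host.docker.internal` only resolves from inside Docker Desktop containers; if Ollama runs natively on the host, the app should talk to `localhost:11434` instead. A quick way to check which host actually answers (a hypothetical helper, not part of the repo):

```python
import socket


def ollama_reachable(host: str, port: int = 11434, timeout: float = 3.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and unresolvable hosts
        # such as host.docker.internal outside Docker Desktop.
        return False
```

If `ollama_reachable("localhost")` is True while `ollama_reachable("host.docker.internal")` is False, updating the `base_url` in the config to `http://localhost:11434` would match the running setup.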
