bug: hanging connection causing blank screen #2337
Comments
PR welcome!
Updated some code on our dev branch, let us know if that did anything for you.
Hey @tjbck, I got around to testing the dev branch today. The problem still seems to happen. I should have said to use
Having the same exact issue here, but I'm not using Docker. I also have these vars set (Ollama is running in GPU mode), and the OS is Arch Linux with the LTS kernel.
@Hiradpi I'm going to guess this is on Windows. You'll need to set `ENABLE_LITELLM=False`.
This worked! I added `ENABLE_LITELLM=False` to the .env file and it solved the problem. Thanks a lot <3
For clarity, this is unrelated to the issue originally described. I just tested it again on the latest dev branch, and the UI is still blank when the Ollama service is reachable but unresponsive (such as when using netcat instead of Ollama to listen on the port).
Just pushed a massive refactor/update to our dev branch, please try again and let us know how it went! Much thanks!
Now the UI is blank for 5 seconds, and then it appears once the backend's connection to Ollama times out; in other words, the UI only renders after the models request finishes. That is definitely much better than indefinitely showing a blank screen (enough that I would consider the original issue resolved), but I do think the UI should be able to show before that request completes. Feel free to close this issue, or use it to continue tracking the UI not appearing until this request finishes, at your discretion.
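One way the backend could let the UI render before the model list resolves is to bound the Ollama request with a timeout and fall back to an empty list on failure. A minimal stdlib-only sketch, not Open WebUI's actual code: the `fetch_models` helper name and the fallback behavior are my assumptions; `GET /api/tags` is Ollama's real list-models endpoint.

```python
import json
import urllib.request

def fetch_models(base_url: str, timeout_s: float = 5.0) -> list:
    """Return Ollama's model list via GET /api/tags, or an empty list if
    the server is unreachable or unresponsive, so the caller never blocks
    longer than timeout_s."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout_s) as resp:
            return json.load(resp).get("models", [])
    except (OSError, ValueError):
        # OSError covers refused connections and socket timeouts;
        # ValueError covers a malformed JSON body.
        return []
```

With a bounded call like this, the frontend can render immediately with an empty model list and surface a connection error instead of a blank screen.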
@Zambito1 could you try disabling the Ollama connection from Settings > Connections?
Hm, that menu actually has some odd behavior when I try to do that. When I navigate there while listening with netcat instead of Ollama, the UI shows Ollama and OpenAI as disabled. When the connection attempt to Ollama times out, the UI changes automatically, switching both to enabled. I did not interact with anything manually between the above screenshots. If nothing is listening on the port where Open WebUI expects Ollama (neither netcat nor Ollama), both are immediately populated and enabled when I navigate to the menu. Note that I did not manually enter the URL between the two screenshots; I just closed netcat and then reopened the settings window.
Just added a fix to include a loading screen, let me know if you encounter the same set of issues!
On commit b7fc37d the settings load for 12 seconds before timing out (two sequential 6-second timeouts). After the connection times out, the settings show with the URLs populated and the connections enabled. If I disable the Ollama connection now, I can refresh the page without waiting on a blank screen like in my earlier tests today.
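The 12-second wait comes from the two 6-second timeouts running back to back; probing both connections concurrently would bound the wait to the single longest timeout instead of their sum. A self-contained sketch of that difference, using short timeouts and a long sleep to stand in for an unresponsive server (none of this is Open WebUI code):

```python
import asyncio
import time

async def probe(timeout_s: float) -> bool:
    """Health-check stand-in: the 'server' (a long sleep) never answers."""
    try:
        await asyncio.wait_for(asyncio.sleep(3600), timeout=timeout_s)
        return True
    except asyncio.TimeoutError:
        return False

async def sequential(t: float) -> None:
    await probe(t)  # e.g. the Ollama connection check
    await probe(t)  # e.g. the OpenAI connection check

async def concurrent(t: float) -> None:
    await asyncio.gather(probe(t), probe(t))  # both checks overlap

start = time.monotonic()
asyncio.run(sequential(0.1))
seq_elapsed = time.monotonic() - start  # roughly 2 * timeout: they add up

start = time.monotonic()
asyncio.run(concurrent(0.1))
con_elapsed = time.monotonic() - start  # roughly 1 * timeout: they overlap
```

With 6-second timeouts, the same change would cut the worst-case settings load from 12 seconds to 6.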
Bug Report
Description
Bug Summary:
If the Open WebUI backend hangs indefinitely, the UI will show a blank screen with just the keybinding help button in the bottom right.
Steps to Reproduce:
I noticed this because I run Open WebUI on my desktop, and Ollama on another machine. If I connected to a VPN on my desktop, LAN connections from my desktop would hang indefinitely. When I tried to boot up Open WebUI, I would just see the screen above.
The easiest way to reproduce this is to run `nc -lp 11434` and try to use that as the Ollama server. The Open WebUI backend tries to connect to Ollama, but `nc` will never respond to the request, nor will it close the connection, so you will see the above screen when you open the web interface.

I was able to work around my VPN issue with split tunneling, routing only the traffic I want to access over the VPN through the virtual interface.
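The same hang can be reproduced without netcat: any socket that accepts a connection but never replies will do. The stdlib-only sketch below (the helper name, port choice, and 0.5-second timeout are all arbitrary) shows that a client-side timeout turns the indefinite hang into a prompt, catchable error:

```python
import socket
import threading
import urllib.request

def silent_listener(ready: threading.Event, holder: list) -> None:
    """Like `nc -lp 11434`: accept one connection, then say nothing."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))  # pick any free port
    srv.listen(1)
    holder.append(srv.getsockname()[1])
    ready.set()
    conn, _ = srv.accept()
    threading.Event().wait(5)   # hold the connection open, silently
    conn.close()
    srv.close()

ready, holder = threading.Event(), []
threading.Thread(target=silent_listener, args=(ready, holder), daemon=True).start()
ready.wait()  # wait until the listener has bound a port

try:
    # Without timeout=, this call would block indefinitely, which is
    # exactly the hang the backend runs into.
    urllib.request.urlopen(f"http://127.0.0.1:{holder[0]}/", timeout=0.5)
    timed_out = False
except OSError:  # socket.timeout is an OSError subclass
    timed_out = True
```

A backend that wraps its Ollama requests this way can report a connection error to the UI instead of presenting a blank page.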
Expected Behavior:
You can see the web interface, and ideally there will be some sort of timeout to show a connection error.
Actual Behavior:
You are hit with a wall of nothing.
Environment
Open WebUI Version: v0.1.124
Ollama (if applicable): NA
Operating System: NA
Browser (if applicable): NA
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
NA; the connection issue is on the backend, so nothing fails in the UI logs.
Docker Container Logs:
This is what happens when you load the index page:
If you kill `nc`, the UI will actually show up.
Screenshots (if applicable):
See above.
Installation Method
Docker, but probably does not matter.