Is the app still unstable?
-
The Open Web UI project initially started out as an Ollama frontend. Running local models requires GPU support, and Cloudron currently does not have GPU support. Over time, Open Web UI has shifted to being a frontend for OpenAI API compatible services. It is also now possible to run Ollama on another machine that has a GPU and then point Open Web UI at that Ollama instance.
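As a rough sketch of that setup (the hostname/IP and port here are placeholders; `OLLAMA_HOST` and `OLLAMA_BASE_URL` are the variables Ollama and Open Web UI are commonly configured with, but check the current docs for your versions):

```shell
# On the GPU machine: make Ollama listen on all interfaces,
# not just localhost, so other hosts can reach it
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# On the Open Web UI side: point the app at the remote Ollama instance
# (replace 192.168.1.50 with the GPU machine's address)
export OLLAMA_BASE_URL=http://192.168.1.50:11434
```

Note that exposing Ollama beyond localhost means anyone who can reach that port can use it, so keep it on a trusted network or behind a firewall.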
I guess we can mark this as stable since it's still useful while we figure out GPU support in Cloudron.
