Is the app still unstable?
-
The Open Web UI project initially started out as an Ollama frontend. Running local models needs GPU support, and Cloudron currently does not have GPU support. Over time, Open Web UI has shifted to being a frontend for OpenAI API compatible services. It is also now possible to run Ollama on another machine that has a GPU and then point Open Web UI at that Ollama instance.
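As a rough sketch of the remote setup: Ollama exposes an OpenAI-compatible endpoint under `/v1`, so Open Web UI (or any OpenAI client) can be pointed at it over the network, typically via Open Web UI's `OLLAMA_BASE_URL` setting. The hostname `gpu-box` and the model name below are hypothetical; the snippet just verifies the remote endpoint responds before wiring it into Open Web UI.

```python
# Minimal sketch: talk to a remote Ollama's OpenAI-compatible API.
# Assumes Ollama is reachable at http://gpu-box:11434 (hypothetical host)
# and that a model (here "llama3") has already been pulled there.
from openai import OpenAI

client = OpenAI(
    base_url="http://gpu-box:11434/v1",  # remote Ollama, not OpenAI
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

resp = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```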
I guess we can mark this as stable since it's still useful while we figure out GPU support in Cloudron.