Update OpenWebUI without Ollama
-
Is it possible to skip updating Ollama when there's an OpenWebUI update? We don't use Ollama at all, but it takes so much storage and time, and we simply can't skip it.
So we effectively have to pay for extra cloud storage for something we don't use at all.
-
Since ollama is part of the app package, an update to it will always trigger a new package version. To achieve independent updates, we would have to separate ollama from openwebui, but that makes the integration more error-prone and requires manual steps. For your case, I guess you can switch to manual updates and skip the backup creation when an update happens. Either way, those few extra backups will be purged over time, depending on your backup retention.
-
Hello @firmansi
Is this doc not what you are looking for?
https://docs.cloudron.io/packages/openwebui/#ollama
-
Actually, it is possible to have OpenWebUI without Ollama, but that means there would be two versions of OpenWebUI in Cloudron, one with Ollama and one without, and I am not sure the Cloudron team would favour this technique since it means they would have to maintain two versions.
-
@firmansi I think rather we will remove ollama from the current OpenWebUI package altogether. That way the ollama instance can be shared by the apps. Also, maybe people now have custom apps which would benefit from this.
-
@girish Ya, I think this is a good approach because, in real use cases, most people prefer not to run inference models on the same machine as the OpenWebUI instance. OpenWebUI is good enough bundled with ChromaDB, SQLite or PostgreSQL, and Redis as one package, with the inference models (including Ollama) running on different servers.
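For anyone wiring this up, here is a minimal sketch to confirm a remote Ollama instance is reachable from the OpenWebUI host before pointing the app at it. The hostname is hypothetical; `/api/tags` is Ollama's standard model-listing endpoint:

```python
# Minimal reachability check for a remote Ollama server.
# The hostname below is a placeholder; substitute your inference server.
import json
import urllib.request

OLLAMA_URL = "http://inference-server.example.com:11434"  # hypothetical remote host

# /api/tags lists the models the Ollama server has pulled.
with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
    models = json.load(resp).get("models", [])

print(f"Reachable; {len(models)} model(s) available:")
for m in models:
    print(" -", m.get("name"))
```

If this prints your models, the same base URL should work as the external Ollama connection in OpenWebUI.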