Even though it's out of the context of this thread, I see that the Cloudron team has already started the process of detaching Ollama from OpenWebUI.
Posts by firmansi
-
Ollama is now available
Finally... hopefully it can be detached from OpenWebUI in the next update.
-
Connecting Cloudron Nextcloud to Cloudron Collabora
My comment on this thread may be out of context, but for your information: I have tried for the last year, up until today, to run Collabora as Nextcloud Office with Nextcloud Community Edition (I have no idea how well Collabora runs on NC Enterprise Edition), and it is not reliable. I have been using OnlyOffice instead, and OnlyOffice seems much more mature in stability and reliability, especially when running with NC Community Edition.
-
Nextcloud update pushed too early?
So far I have none of the issues mentioned above with the latest update. My stack runs OnlyOffice, Nextcloud Assistant, Team Folder, draw.io, and Groupware, and my NC instance serves more than 100 users on Cloudron.
-
Accessing Nextcloud App Store following 32.0.0 update
@osobo I don't think it's possible to run this with Cloudron's Docker setup unless you run it on a dedicated machine without Docker. You simply can't install Docker on top of Docker.
-
Update OpenWebUI without Ollama
@girish Yes, I think this is a good approach, because in real use cases most people prefer to run inference models on a different machine from the OpenWebUI instance. OpenWebUI is good enough bundled with ChromaDB, an SQLite or PostgreSQL database, and Redis as one package, with the inference models (including Ollama) running on different servers, as sketched below.
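In case it helps, a minimal sketch of that split, written compose-style; it assumes OpenWebUI's documented OLLAMA_BASE_URL and OPENAI_API_BASE_URL variables, and the host names and key below are placeholders, not anything Cloudron-specific:

services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Point OpenWebUI at an Ollama server running on a different machine (placeholder host)
      - OLLAMA_BASE_URL=http://inference-host.example.com:11434
      # Or at any OpenAI-compatible inference endpoint (placeholder URL and key)
      - OPENAI_API_BASE_URL=https://inference-host.example.com/v1
      - OPENAI_API_KEY=changeme

The OpenWebUI package itself then only needs its local database and vector store, while all model serving happens on the other host.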
-
Update OpenWebUI without Ollama
Actually, it is possible to have OpenWebUI without Ollama, but that would mean two versions of OpenWebUI on Cloudron, one with Ollama and one without, and I am not sure that approach would be favoured by the Cloudron team, since it means they would have to maintain two packages.
-
Openwebui V0.6.30
I see that the latest OpenWebUI update is already in the pipeline but not deployed yet. Are there any upstream issues with this latest update?
-
Anyone successfully installed a fileserver MCP?
@Divemasterza What do you mean by "correct the OWUI"? And what kind of app do you use to host your MCP tools?
-
Update OpenWebUI without Ollama
@james I have done this.
-
Update OpenWebUI without Ollama
@nebulon Thanks for the explanation.
-
Update OpenWebUI without Ollama
Is it possible to skip updating Ollama when there is an OpenWebUI update? We don't use Ollama at all, but it takes so much storage and time, and we simply can't skip it. So it's as if we have to pay for extra cloud storage for something we don't use at all.
-
Anyone successfully installed a fileserver MCP?
@Divemasterza Are you using this method with an existing OpenWebUI instance on Cloudron?
-
How to create an admin user?
@luckym Only the user created through the create-account form when you first set up LibreChat, and you need to follow the sequence of steps in the Cloudron checklist; otherwise you might run into database issues.
-
Ghost is joining the Fediverse and adding ActivityPub
I am actually disappointed with the latest major release of Ghost; it's not even better than the previous one. It's not just ActivityPub, where I've run into the issues discussed in this thread: in this latest version, if you don't have Tinybird installed, you can't use the analytics properly. Previously I could see which members clicked our newsletter, but now we can only see the number of clicks without knowing which members. I really hope someone from Ghost sees this thread and sees how the latest update is a disappointment to some loyal users.
-
How to create an admin user?
When the instance in Cloudron is ready, the first user you create for LibreChat will act as the admin.
-
3.93.1 was the last stable release of n8n (fixed in 3.97.0)
@umnz Still a pre-release?
-
Integrated Redis
@james Yes, it works like a charm.
-
Integrated Redis
@james Hi, that's not what I mean. What I mean is https://docs.openwebui.com/tutorials/integrations/redis/
As far as I know, the Docker configuration is:

version: '3.8'
services:
  redis:
    image: redis:latest
    container_name: redis
    ports:
      - "6379:6379"
    command: ["redis-server", "--requirepass", ""]  # Optional: add a password
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "8080:8080"
    environment:
      - REDIS_URL=redis://redis:6379
      - WEBSOCKET_REDIS_URL=redis://redis:6379
    depends_on:
      - redis
    volumes:
      - ./data:/app/backend/data

I don't know how to do that in Cloudron.
-
Integrated Redis
@girish I truly hope that in the next update the Cloudron team can integrate Redis into the OpenWebUI instance, especially since Redis already appears to be available within Cloudron. Redis is crucial for enabling real-time communication between client applications and OpenWebUI. Without it, OpenWebUI becomes difficult to use for business purposes, particularly when there are many users: the absence of Redis leads to frequent disconnections in streaming and real-time communication from the client application to the server (OpenWebUI), as well as issues when loading models from inference in the admin panel.
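For reference, a minimal sketch of the Redis-related settings described in the OpenWebUI environment documentation, written compose-style like the config above; the redis://redis:6379 URL is a placeholder, and I'm assuming Cloudron would point these at its own Redis addon:

services:
  open-webui:
    environment:
      - ENABLE_WEBSOCKET_SUPPORT=true           # turn on the websocket layer
      - WEBSOCKET_MANAGER=redis                 # coordinate websocket sessions through Redis
      - WEBSOCKET_REDIS_URL=redis://redis:6379  # placeholder; Cloudron's Redis addon URL would go here
      - REDIS_URL=redis://redis:6379            # placeholder; app-level state and caching

With these set, websocket session state lives in Redis rather than in a single process's memory, which is what keeps streaming connections from dropping when there are many concurrent users.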