@james I have done this
firmansi
Posts
-
Update OpenWebUI without Ollama
@nebulon Thanks for the explanation.
-
Update OpenWebUI without Ollama
Is it possible to skip updating Ollama when there's an OpenWebUI update? We don't use Ollama at all, but it takes so much storage and time, and we simply can't skip it.
So we end up paying for extra cloud storage for something we don't use at all.
-
Anyone successfully installed a fileserver MCP?
@Divemasterza Are you using this method with an existing OpenWebUI instance on Cloudron?
-
How to create an admin user?
@luckym Only for the user created through the create-account form when you first set up LibreChat, and you need to follow the sequence laid out in the Cloudron checklist; otherwise you might run into database issues.
-
Ghost is joining the Fediverse and adding ActivityPub
I am actually disappointed with the latest major release of Ghost; it's not even better than the previous one. It's not just ActivityPub, where I'm running into the issues discussed in this thread: in this version, if we don't have Tinybird installed, we can't use the analytics properly. Previously I could see which members clicked our newsletter; now we can only see the number of clicks without knowing which members they were. I really hope someone from Ghost sees this thread and sees how the latest update is a disappointment to some loyal users.
-
How to create an admin user?
When the Cloudron instance is ready, the first user you create in LibreChat will act as the admin.
-
3.93.1 was the last stable release of n8n (fixed in 3.97.0)
@umnz Still a pre-release?
-
Integrated Redis
@james Yes, it works like a charm.
-
Integrated Redis
@james Hi, that's not what I mean. What I mean is https://docs.openwebui.com/tutorials/integrations/redis/
The Docker configuration, as far as I know, is:

version: '3.8'
services:
  redis:
    image: redis:latest
    container_name: redis
    ports:
      - "6379:6379"
    command: ["redis-server", "--requirepass", ""] # Optional: add a password
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "8080:8080"
    environment:
      - REDIS_URL=redis://redis:6379
      - WEBSOCKET_REDIS_URL=redis://redis:6379
    depends_on:
      - redis
    volumes:
      - ./data:/app/backend/data

I don't know how to do that in Cloudron.
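For context, the tutorial above boils down to a handful of environment variables, and Cloudron's Redis addon already exposes a connection URL as CLOUDRON_REDIS_URL. Purely as a sketch of what a packaged integration might wire up (the mapping below is my assumption, not something the current Cloudron package does), in the same compose style as above:

    environment:
      - ENABLE_WEBSOCKET_SUPPORT=true
      - WEBSOCKET_MANAGER=redis
      # Assumed mapping: reuse the URL that Cloudron's Redis addon provides
      - REDIS_URL=${CLOUDRON_REDIS_URL}
      - WEBSOCKET_REDIS_URL=${CLOUDRON_REDIS_URL}

So the ask is really just that the package enables the Redis addon and fills in these variables.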
-
Integrated Redis
@girish I truly hope that in the next update the Cloudron team can integrate Redis into the OpenWebUI instance, especially since Redis already appears to be available within Cloudron. Redis is crucial for enabling real-time communication between client applications and OpenWebUI. Without it, OpenWebUI becomes difficult to use for business purposes, particularly with many users: the absence of Redis leads to frequent disconnections in streaming and real-time communication from the client application to the server (OpenWebUI), as well as issues when loading models from the inference backend in the admin panel.
-
LibreChat PostgreSQL / pgvector
@girish Yes, this will make it more like OpenWebUI.
-
Integrated Redis
Recent OpenWebUI releases already ship with Redis management; is there any plan for the Cloudron team to have Redis activated along with the OpenWebUI instance?
-
Tinybird Integration
@joseph Yes, that's what I think.
-
Tinybird Integration
@joseph Is it possible with Cloudron at the moment?
-
Tinybird Integration
@jrl-abstract27 It doesn't work for me either.
-
Tinybird Integration
Tinybird is the web analytics for Ghost 6. It's provided automatically through Ghost Pro in the cloud, but for the self-hosted version it must be wrapped into the Docker package.
-
Tinybird Integration
I see the upcoming major release of Ghost coming in the Cloudron Git repository, and I hope this update will include Tinybird integration, since it acts as the native analytics for Ghost.
-
LibreChat - Package Updates
@nebulon If I may ask, is the new update already bundled with the RAG API?
-
Anyone successfully installed a fileserver MCP?
@Divemasterza This is also not supported on Cloudron. I have asked the Cloudron team before to ship the package with an MCP proxy, but it's still not ready. The workaround is to use an MCP server hosted on another machine, such as https://github.com/metatool-ai/metamcp.