@Lanhild said in ETA for GPU support? Could we contribute to help it along?:
A lot of companies that might deploy Cloudron for its ease of life features don't necessarily have a VPS with a GPU.
Also (this might help deepen your Cloudron knowledge): Cloudron packages usually contain only one component/application.
Moreover, OpenWebUI is "just" a UI that supports connections to Ollama and isn't affiliated with it, meaning Ollama isn't a dependency of it at all.
Excellent points @Lanhild - you've convinced me. 🙂
And there are benefits on the Ollama side too. For instance, I would appreciate using Cloudron to keep our Ollama installation automatically up to date on its own.
In fact, given that we still can't modify the existing Cloudron OpenWebUI app to run with our GPUs, this is the approach we are now taking for our small clients: using Cloudron just for the OpenWebUI component and letting them connect to our separately hosted Ollama. It's a bit less convenient than we were hoping, but at least we'll still have segregated data and user management for each client in OpenWebUI.
So now, I also want a Cloudron OpenWebUI app that does not come with bundled Ollama, so that I can be sure these customers don't hammer our CPUs and get frustrated by a slow user experience. 🙂
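For anyone else heading down the same path: outside of Cloudron, pointing Open WebUI at an external Ollama host is just an environment variable. This is a minimal sketch using plain Docker (not the Cloudron package, which doesn't currently expose this); `OLLAMA_BASE_URL` is Open WebUI's standard variable for this, and the hostname below is a placeholder for your own Ollama server.

```shell
# Hypothetical sketch: run Open WebUI against a remote Ollama instead of
# a bundled one. Replace the URL with your separately hosted Ollama.
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=https://ollama.example.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

If the Cloudron package ever supported overriding that variable, the bundled Ollama could simply sit unused while clients talk to the shared GPU host.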