Launch OpenWebUI with GPU support
-
Per this thread, I think it's possible to add Nvidia GPU support to a Cloudron server without impacting Cloudron (or at least not breaking it).
So I'd like to be able to launch OpenWebUI with GPU support.
I think that adding the `--gpus=all` switch to the startup options and the `:cuda` tag to the image name might be all that's required. (Assuming the Ubuntu host has the Nvidia driver, CUDA, and the Docker CUDA toolkit properly installed, I think it'll work, and if not, I believe it'll fall back to using CPU.)

Hard-coding this would be enough for us, and I don't think it would break for anyone who doesn't have a GPU. The only people it might not suit are people who have a GPU but don't want to use it, so adding a configuration switch of some kind to turn it on or off would be even nicer.
-
Is the Docker CUDA toolkit different to the NVIDIA Container Toolkit?
https://github.com/NVIDIA/nvidia-container-toolkit -
That's a good question, since other somewhat recent guides refer to `nvidia-docker2`, which I know is still installable in current standard Ubuntu because I installed it the other day.

@AartJansen I believe the nvidia-container-toolkit you linked is the current supported version of the same thing, considering NVIDIA's current installation guide and the deprecation notice on nvidia-docker. This is what I currently have installed.
I would be happy to test/verify and report back about that, but at the moment I can't launch OpenWebUI with GPU support on Cloudron so I can't absolutely verify with an end-to-end test.
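In the meantime, a quick host-level sanity check (outside of Cloudron) would be running NVIDIA's base CUDA image and calling `nvidia-smi` inside it. The exact image tag below is just one I believe NVIDIA publishes; any recent `nvidia/cuda` base tag should behave the same.

```shell
# Sanity check: if the driver and container toolkit are set up correctly,
# this prints the GPU table from inside a container. The tag is an example.
docker run --rm --gpus=all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```

If this works, the GPU plumbing is fine and only the OpenWebUI launch switches are missing.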
To be clear though, my current feature request is not for Cloudron to install the Nvidia drivers and CUDA and container support. Although it would be wonderful for Cloudron to handle all that or some of that - and it's worthy of a separate feature request - I wasn't (yet) asking for Cloudron to handle the proprietary stuff. I was only asking for the GPU support launch switches to be added to the existing OpenWebUI Cloudron container. My plan was to keep the request as simple as possible for the moment, in the hope it might get into a release sooner rather than later.
-
Yeah, I get that. I don't have much understanding of Docker and don't really know what the container toolkit does to existing containers; my only interest is using an Nvidia GPU to transcode in Jellyfin. My problem with the installation guide is that I can't find the file /etc/docker/daemon.json in step 1 of the configuration. I haven't run the command because I doubt it will succeed if the file it modifies doesn't exist, and even if it creates it, it's probably not going to affect the existing Cloudron Docker apps.
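For what it's worth, as far as I know the toolkit's configure step will create /etc/docker/daemon.json if it doesn't already exist, so a missing file shouldn't be a blocker. Roughly, the step in NVIDIA's guide is:

```shell
# Registers the "nvidia" runtime in /etc/docker/daemon.json
# (creating the file if it is missing), then reloads Docker.
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```

My understanding is that this only adds an extra runtime entry and does not change Docker's default runtime, so existing Cloudron containers should keep running as before; GPU access still has to be requested per container (e.g. via `--gpus`). That said, I haven't tested this on a Cloudron host, so treat it as an assumption.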
-
I am not a Docker or Linux expert (only operational knowledge), but my GPU is running on the Cloudron server (a virtual machine in my case). Happy to share insights outside of this thread if you'd like (although I'm not sure I can help), but I don't believe this is related to the feature request, so to avoid confusing everyone I don't think we should discuss it here.
-