Is local Ollama installation updatable by Cloudron?
Hi, as discussed here and here: using a local GPU (if available) with Ollama is critical for performance.
Therefore: Is it possible to install Ollama locally on the server (outside of Cloudron, for direct GPU support) and then install the Ollama Cloudron package to keep Ollama continuously updated? Or is there any other way to keep Ollama up to date automatically?
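For the "any other way" part: one common approach outside of Cloudron (a sketch, not a Cloudron feature) is to re-run Ollama's official install script on a schedule, since it upgrades an existing installation in place. A hypothetical cron entry could look like this:

```shell
# Crontab fragment (assumption: Ollama was installed via the official
# install script at https://ollama.com/install.sh, which also performs
# in-place upgrades when a newer release exists).
# Re-run the installer every Sunday at 04:00 to pick up new releases:
0 4 * * 0  curl -fsSL https://ollama.com/install.sh | sh
```

This keeps the host-level Ollama current, but note it runs independently of Cloudron's own update mechanism, so the Cloudron package and the local install would not necessarily be on the same version.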
I love the professional level Cloudron has reached!
