Cloudron Forum

Is local Ollama installation updatable by Cloudron?

Ollama
nottheend
wrote last edited by nottheend
#1

    Hi, as discussed here and here: using a local GPU (if available) with Ollama is critical for getting good performance.

    Therefore: is it possible to install Ollama locally on the server (outside of Cloudron, for direct GPU support) and then use the Ollama Cloudron package to keep that installation continuously updated? Or is there any other way to keep Ollama up to date automatically?

    I love the professional level Cloudron has reached! 🙂

nebulon (Staff)
wrote last edited by
#2

      The main issue with Ollama and GPU access is that NVIDIA requires a patched Docker, and we rely on the official Ubuntu Docker packages. We have to revisit this from time to time though, as the space is evolving fast.

      Also, we have mostly only looked into NVIDIA. For AMD and Intel this might already just work by forwarding the corresponding /dev/dri device to the container, since they don't need a patched Docker. The struggle there, however, is that we don't have access to such devices to test any of this; most VPS providers only offer NVIDIA chips for rent.
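      For reference, the device-forwarding approach with plain Docker looks roughly like this. This is a hedged sketch based on Ollama's own Docker instructions for AMD GPUs, not something Cloudron does today; the image tag and device paths are the upstream defaults and may differ on your system:

```shell
# Sketch: running Ollama in a stock Docker container with an AMD GPU,
# by forwarding the kernel GPU devices instead of using a patched runtime.
# /dev/kfd is the ROCm compute interface; /dev/dri holds the DRM render nodes.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```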

      The alternative of running Ollama on the host system should actually work fine, but then you have to manage Ollama on your own: Cloudron cannot update it, system updates are your responsibility, and a Cloudron platform update might break your setup, since we cannot test for such configurations either. If this is not a production server, that is probably fine.
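      If you do run Ollama on the host, one way to keep it current is to re-run the official Linux install script periodically, since it upgrades an existing installation in place. A minimal sketch, assuming cron is available; the file path and log location are arbitrary choices, not requirements:

```shell
#!/bin/sh
# Hypothetical cron job, e.g. saved as /etc/cron.weekly/update-ollama
# (make it executable with: chmod +x /etc/cron.weekly/update-ollama).
# The official install script upgrades an existing Ollama install in place.
curl -fsSL https://ollama.com/install.sh | sh >> /var/log/ollama-update.log 2>&1
```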

nottheend
wrote last edited by nottheend
#3

        Thank you for the detailed response.

        I appreciate that you stick to these principles; it makes a long life for the Cloudron product more likely 🫶

        From reading the docs, I understand that changing any firewall configuration on the server itself is not advised. However: how can the Open WebUI app access the Ollama instance running on the host on the relevant port?

        I already configured Ollama to bind to 0.0.0.0 on that port, but the connection seems to be blocked from within the Open WebUI app container itself.
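        For anyone in the same spot, the usual shape of this setup is: make the host Ollama listen beyond localhost, then reach it from the app container via the Docker bridge gateway. A hedged sketch, assuming a systemd-managed Ollama and Docker's default bridge network (the gateway address 172.17.0.1 and port 11434 are the common defaults, not guaranteed on every setup):

```shell
# 1. Make the host Ollama listen on all interfaces, not just localhost.
#    `systemctl edit` opens an override file; add these lines to it:
#      [Service]
#      Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl edit ollama
sudo systemctl restart ollama

# 2. From inside the app container, talk to the host via the bridge gateway:
curl http://172.17.0.1:11434/api/version
```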

        As you said, this might not be advisable for a production instance. But what would be the least inadvisable configuration?
