Cloudron Forum

Announcements
Ollama is now available

7 Posts 4 Posters 141 Views
#1 nebulon (Staff)

    Ollama is now packaged as a standalone app.

    The app package is, as usual, published as unstable until we get some feedback on how it works and iron out any initial issues.

    The forum category is at https://forum.cloudron.io/category/212/ollama, so please report any issues there.
    The docs will be at https://docs.cloudron.io/packages/ollama/
    The package repo is https://git.cloudron.io/packages/ollama-app
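
    Once the app is installed, it should expose the standard Ollama HTTP API. As a rough smoke test (the domain ollama.example.com is a placeholder for whatever domain you chose at install time, and llama3.2 is just an example model name):

    ```shell
    # Base URL of the Ollama app; ollama.example.com is a placeholder for
    # whatever domain you chose when installing the app on Cloudron.
    OLLAMA_HOST="https://ollama.example.com"

    # Non-streaming request body for the standard Ollama /api/generate
    # endpoint (llama3.2 is just an example; use any model you have pulled).
    PAYLOAD='{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'

    # Ask the app for a completion; print a hint instead of failing hard
    # when the app is not reachable from this machine.
    curl -sf -m 10 "$OLLAMA_HOST/api/generate" -d "$PAYLOAD" \
      || echo "request failed: is the app installed and reachable at $OLLAMA_HOST?"
    ```

    If the app is up and the model has been pulled, the response is a JSON object containing the generated text.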

#2 firmansi

      Finally... hopefully it can be detached from OpenWebUI in the next update.

#3 nebulon (Staff)

        Yes, that was the plan: we will remove it from OpenWebUI once the standalone package becomes stable.

#4 abuyuy

          Is it able to use a graphics card?

#5 james (Staff)

            Hello @abuyuy
            Currently, the Ollama app does not include the capability to access VAAPI devices (GPUs).
            We can add this capability to the Ollama app, but how to configure it will need to be researched first.
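
            For anyone curious what such support would involve: a common first step is checking which GPU device nodes are visible inside the app's environment. This is only a sketch using the usual Linux device paths, not anything the current Cloudron package exposes:

            ```shell
            # List GPU-related device nodes, if any are visible.
            # /dev/dri holds DRM/VAAPI render nodes (e.g. renderD128);
            # /dev/nvidia* would exist with the NVIDIA driver stack.
            DRI_DEVICES=$(ls /dev/dri 2>/dev/null || echo "none")
            NVIDIA_DEVICES=$(ls /dev/nvidia* 2>/dev/null || echo "none")
            echo "DRI devices: $DRI_DEVICES"
            echo "NVIDIA devices: $NVIDIA_DEVICES"
            ```

            If both report "none" inside the container, Ollama falls back to CPU-only inference regardless of what the host has.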

#6 abuyuy

              Thanks, James, for the quick answer.
              I think GPU support would tremendously help the performance and usability of the app. 🙂

#7 nebulon (Staff)

                @abuyuy Yes, fully agree. We will first focus on getting the app to a stable package state and then investigate what is required for hardware support.
