

Cloudron Forum


Ollama is now available

Announcements · 13 Posts · 7 Posters · 426 Views
nebulon (Staff) · #3

Yes, that was the plan; we will remove it from OpenWebUI once the standalone package becomes stable.

abuyuy · #4

Is it able to use a graphics card?

james (Staff) · #5

Hello @abuyuy
Currently, the Ollama app does not include the capability to access VAAPI devices (GPUs).
We can add this capability to the Ollama app, but how to configure it still needs to be researched.
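For reference, plain Docker exposes GPUs to a container either through its built-in Nvidia integration or by passing device nodes through; a minimal sketch of the generic Docker usage (this is not Cloudron's packaging mechanism, just how it works in stock Docker):

```shell
# Nvidia GPUs, via the NVIDIA Container Toolkit installed on the host:
docker run --rm --gpus all ollama/ollama

# Intel/AMD VAAPI devices, passed through as DRI device nodes:
docker run --rm --device /dev/dri:/dev/dri ollama/ollama
```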

abuyuy · #6

Thanks, James, for the quick answer.
I think GPU support for Ollama would be tremendously helpful for the performance and usability of the app. 🙂

nebulon (Staff) · #7

@abuyuy Yes, fully agreed. We will first focus on getting the app to a stable package state and then investigate what is required for hardware support.

ekevu123 · #8

How do you intend to get a GPU onto the server? With most public hosting plans, that is quite expensive. I guess this is mainly relevant for people running their own hardware?

firmansi · #9

Even though it's outside the scope of this thread, I see that the Cloudron team has already started detaching Ollama from OpenWebUI.

robw · #10

Firstly, this is wonderful news. I see that your Ollama package appears to include an authentication capability, which is something vanilla Ollama is missing. Coupled with Cloudron's app management features, that makes it extremely helpful.

However, without GPU support, or a very fast CPU and plenty of fast RAM, isn't self-hosting Ollama almost useless?

Note: here's an old thread where we discussed some related topics.
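For anyone wondering what the missing authentication means in practice: vanilla Ollama's HTTP API listens on port 11434 and accepts requests from anyone who can reach it. A sketch below; the bearer-token call is only an assumption about how a fronting auth layer could work, not documented Cloudron behavior, and `ollama.example.com` and `$OLLAMA_TOKEN` are hypothetical:

```shell
# Vanilla Ollama: unauthenticated API on port 11434.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'

# Behind an auth layer, a client would attach a credential, e.g.:
curl https://ollama.example.com/api/generate \
  -H "Authorization: Bearer $OLLAMA_TOKEN" \
  -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'
```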

robw · #11 (in reply to @james, #5)

@james In our case (we have some old enterprise Nvidia GPUs), we found that installing the Nvidia drivers and CUDA toolkit is ultimately not difficult once you know what to do. Our problem with making that useful on Cloudron (per this old thread) was that we couldn't start Cloudron's OpenWebUI package with GPU support. So even though we had our GPU hardware working on a Cloudron server, we could never use it with a Cloudron instance of OpenWebUI/Ollama.

I suspect the process of installing the Nvidia drivers and CUDA toolkit is generic enough that it could be supported by Cloudron somehow. However, we don't have any GPUs from other enterprise brands (e.g. AMD, Intel), so we never tried those, and I imagine supporting drivers for consumer GPUs is an impossible maze of complexity.

We would LOVE it if you could support the Nvidia GPU use case as a basic option.
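For others going down this road, the rough shape of the setup described above, on Ubuntu, looks something like the following sketch (package names and versions vary; the driver version is only an example, and the toolkit package comes from NVIDIA's own apt repository):

```shell
# Host drivers (pick a version matching your GPU; 535 is just an example):
sudo apt-get install -y nvidia-driver-535

# NVIDIA Container Toolkit, so Docker containers can see the GPU:
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Verify the GPU is visible from inside a container:
docker run --rm --gpus all ubuntu nvidia-smi
```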

joseph (Staff) · #12 (in reply to @robw, #10)

@robw This package is simply the first step; the second step is to enable GPU/VAAPI support in Docker (this requires some complex automation). @james is looking into this.

nebulon (Staff) · #13

The package is now marked as stable after some internal reorganization. I will lock this thread in favor of the dedicated forum section.

nebulon locked this topic.