Cloudron Forum

Ollama: permissions issue when using volume storage

OpenWebUI
16 Posts 4 Posters 1.2k Views 4 Watching
BrutalBirdie (Partner)
#2

What are you using for the volume?
I am using a sub-account of my Hetzner Storage Box with CIFS, and I know that some apps do not like this at all.
Some just work and some refuse to work.
If I remember correctly, I also had permission issues in my case. 🤔
But that was some time ago.

    Like my work? Consider donating a drink. Cheers!

joseph (Staff)
#3

You have to run the ollama server as the cloudron user, e.g. gosu cloudron ollama serve

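For illustration, a minimal sketch of what that looks like from the app's Web Terminal; the volume path /media/ollama-vol is taken from later in this thread, and the ownership check is just a typical first step, not a prescribed procedure:

# check which user you currently are and who owns the models directory
id
ls -ld /media/ollama-vol /media/ollama-vol/models

# start the server as the cloudron user instead of root
gosu cloudron ollama serve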
ntnsndr
#4

@BrutalBirdie Thanks for sharing your experience. I'm using just the Filesystem mode, in the hope that it would be the most compatible.

@joseph Shouldn't the app run ollama automatically rather than requiring me to do it? And why wouldn't it work on the filesystem volume even when the directory is chmod 777?

Thanks all!

joseph (Staff)
#5

@ntnsndr the supervisor config file in the package sets additional env vars like HOME and NLTK_DATA. Have you tried restarting the app (which ends up restarting supervisor)?

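For context, the supervisor unit being described would look roughly like the sketch below. This is not the actual contents of Cloudron's /etc/supervisor/conf.d/ollama.conf, only an illustration of the shape; the command path, HOME value and user line are assumptions, while OLLAMA_MODELS=/media/ollama-vol/models matches the log further down:

[program:ollama]
command=/usr/local/bin/ollama serve              ; assumed install path
user=cloudron                                    ; drop privileges from root
environment=HOME="/app/data/ollama-home",OLLAMA_MODELS="/media/ollama-vol/models"
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0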
ntnsndr
#6

            @joseph Yep, many times! Any other ideas?

joseph (Staff)
#7

@ntnsndr I think we are missing something.

• why are you trying to run ollama serve manually in the web terminal?
• when you start via supervisor, do you still get the same read-only filesystem error in /root/.ollama?
ntnsndr
#8

@joseph I was only trying to run ollama manually to understand why it is not starting on its own when the app boots.

                So, when I reboot the app with the /media/ollama-vol directory at 777 permissions, here is what I get:

                Apr 15 08:31:05 2025-04-15 14:31:05,744 CRIT Supervisor is running as root. Privileges were not dropped because no user is specified in the config file. If you intend to run as root, you can set user=root in the config file to avoid this message.
                Apr 15 08:31:05 2025-04-15 14:31:05,744 INFO Included extra file "/etc/supervisor/conf.d/ollama.conf" during parsing
                Apr 15 08:31:05 2025-04-15 14:31:05,744 INFO Included extra file "/etc/supervisor/conf.d/openwebui.conf" during parsing
                Apr 15 08:31:05 2025-04-15 14:31:05,748 INFO RPC interface 'supervisor' initialized
                Apr 15 08:31:05 2025-04-15 14:31:05,748 CRIT Server 'unix_http_server' running without any HTTP authentication checking
                Apr 15 08:31:05 2025-04-15 14:31:05,749 INFO supervisord started with pid 1
                Apr 15 08:31:06 2025-04-15 14:31:06,752 INFO spawned: 'ollama' with pid 25
                Apr 15 08:31:06 2025-04-15 14:31:06,755 INFO spawned: 'openwebui' with pid 26
                Apr 15 08:31:06 2025/04/15 14:31:06 routes.go:1231: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:2048 OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/media/ollama-vol/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
                Apr 15 08:31:06 Error: mkdir /media/ollama-vol/models/blobs: permission denied
                Apr 15 08:31:06 2025-04-15 14:31:06,790 WARN exited: ollama (exit status 1; not expected)
                Apr 15 08:31:07 2025-04-15 14:31:07,792 INFO spawned: 'ollama' with pid 33
                Apr 15 08:31:07 2025-04-15 14:31:07,793 INFO success: openwebui entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
                Apr 15 08:31:07 2025/04/15 14:31:07 routes.go:1231: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:2048 OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/media/ollama-vol/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
                Apr 15 08:31:07 Error: mkdir /media/ollama-vol/models/blobs: permission denied
                Apr 15 08:31:07 2025-04-15 14:31:07,823 WARN exited: ollama (exit status 1; not expected)
                Apr 15 08:31:09 2025-04-15 14:31:09,828 INFO spawned: 'ollama' with pid 43
                Apr 15 08:31:09 2025/04/15 14:31:09 routes.go:1231: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:2048 OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/media/ollama-vol/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
                Apr 15 08:31:09 Error: mkdir /media/ollama-vol/models/blobs: permission denied
                Apr 15 08:31:09 2025-04-15 14:31:09,862 WARN exited: ollama (exit status 1; not expected)
                Apr 15 08:31:10 => Healtheck error: Error: connect ECONNREFUSED 172.18.16.191:8080
                
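One way to narrow this down, sketched under the assumption that the ollama process is the one hitting the permission error (commands run from the app's Web Terminal):

# confirm how the volume is mounted and whether it is writable
mount | grep ollama-vol
ls -ld /media/ollama-vol /media/ollama-vol/models

# check whether the cloudron user can actually create the blobs directory
gosu cloudron mkdir -p /media/ollama-vol/models/blobs
gosu cloudron touch /media/ollama-vol/models/blobs/write-test && rm /media/ollama-vol/models/blobs/write-test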
ntnsndr
#9

Relatedly (I am trying again with another volume), does anyone see if I'm doing something wrong here?

                  image.png

I had to mkdir the /media/ollama-vol directory manually before adding the volume.

                  Could there be something screwy with my filesystem permissions in general?

BrutalBirdie (Partner)
#10

Did you ever try without the volume, to see if it works at all?
If not, that should be done first to minimize the error potential.


ntnsndr
#11

                      @BrutalBirdie yes, as noted above: "The problem does not occur under the default app settings. But it does when I use the recommended path of creating a separate volume for the models."

BrutalBirdie (Partner)
#12

Huh 🤔
Then there must be something really strange going on.
Either permissions on the server itself or some other ghost in the shell.


joseph (Staff)
#13

@ntnsndr the Local Directory is something like /opt/ollama-models. You have to create this directory on the server manually. This prevents admins from mounting random paths (the idea being that only a person with SSH access can "create" paths).

In your initial post, what kind of volume are you using? I wonder if it's CIFS and it doesn't support file permissions?

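On the host side, that amounts to something like the sketch below. The chown values are an assumption: the cloudron user inside Cloudron app containers typically has uid/gid 1000, but verify this against the Cloudron volumes documentation before relying on it:

# on the Cloudron server, create the directory that will back the volume
sudo mkdir -p /media/ollama-vol

# assumption: make it writable by the app's user (uid/gid 1000 is common on Cloudron,
# but check your setup first)
sudo chown -R 1000:1000 /media/ollama-vol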
ntnsndr
#14

                            As I said above, I'm using the Filesystem mode for the volume. Should I use another?

ntnsndr
#15

                              Question: What should be the correct directory ownership and permissions setting for the ollama-home folder on a mounted Volume?

abuyuy
#16

I'm struggling with the same issue, using an EXT4 disk as a volume (an additional disk passed to Cloudron via Proxmox).
Any hints on how the directory for the volume needs to be configured (permissions/ownership)?
