Cloudron Forum


Error: mkdir /root/.ollama: read-only file system

nostrdev (#1) wrote:

Getting the following error on a fresh install, after downloading the model:

    root@4772c55c-cc53-4e23-b90e-d3d8b5687bab:/app/code# ollama run qwen3.5:35b
    Error: mkdir /root/.ollama: read-only file system

Setting the HOME variable does not help.
james (Staff, #2) wrote:

Hello @nostrdev

Please use ollama pull $MODEL_NAME as documented at https://docs.cloudron.io/packages/ollama#downloading-models.
This correctly downloads the model to /app/data/ollama-home/.

Cloudron sets the following environment variable in start.sh#L23:

    export OLLAMA_MODELS=/app/data/ollama-home/models

Ollama is started as a supervisor service (https://git.cloudron.io/packages/ollama-app/-/blob/master/supervisor/ollama.conf) in which HOME is set:

    environment=HOME=/app/data/ollama-home

Thus, when you run ollama from the web terminal, you are running it as user root with root's own environment variables.
To use ollama run $MODEL_NAME, you will need to run it the following way:

    gosu cloudron:cloudron bash -c 'HOME=/app/data/ollama-home ollama run $MODEL_NAME'

That ollama run and ollama pull behave so differently is an upstream issue.
IMHO, ollama run should also respect the environment set for the service, but it does not and instead uses os.UserHomeDir().
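The HOME-override trick above can be illustrated without ollama at all. This is a generic sketch of the pattern: the dummy `sh -c` stands in for the ollama client, and a temp directory stands in for /app/data/ollama-home.

```shell
# Sketch of the HOME-override pattern (dummy command stands in for `ollama run`):
fake_home=$(mktemp -d)

# Like the ollama client, the child process writes its state under $HOME.
# Overriding HOME just for this one command redirects that write:
HOME="$fake_home" sh -c 'mkdir -p "$HOME/.ollama" && touch "$HOME/.ollama/history"'

# The state landed under the overridden HOME, not under /root:
ls "$fake_home/.ollama"
```

Because HOME is set only on that command line, the override lasts for that single invocation and nothing else in the shell session is affected.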

james (Staff, #3) wrote:

Now that I think about it, it makes sense that ollama run behaves this way when you look at Ollama outside the Cloudron scope.

For example, say you have an office AI server running ollama serve, and every user runs ollama run in their local shell, connecting to that server.
User tina then writes her own ollama run chat history to /home/tina/.ollama, and user tim writes his to /home/tim/.ollama.
If this were unified into a single server-side HOME, you would have to do the chat separation at the server level.
And that is what the WebUI is for.

This also implies that when using:

    gosu cloudron:cloudron bash -c 'HOME=/app/data/ollama-home ollama run $MODEL_NAME'

you will share the chat history with anyone else who has access to the web terminal.
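The per-user separation described above can be sketched in plain shell, again without ollama (the paths and user names are hypothetical): each client writes under its own $HOME, so histories never mix.

```shell
# Two users, two HOMEs, two separate .ollama histories (hypothetical paths):
base=$(mktemp -d)
mkdir -p "$base/tina" "$base/tim"

# Each "client" appends chat history under its own HOME:
HOME="$base/tina" sh -c 'mkdir -p "$HOME/.ollama"; echo "tina: hello" >> "$HOME/.ollama/history"'
HOME="$base/tim"  sh -c 'mkdir -p "$HOME/.ollama"; echo "tim: hi"     >> "$HOME/.ollama/history"'

# Each history contains only that user's lines:
cat "$base/tina/.ollama/history"
cat "$base/tim/.ollama/history"
```

Pointing everyone at the same HOME (as the gosu command does) collapses both files into one, which is exactly the shared-history caveat above.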

nebulon (Staff, #4) wrote:

Yes, I think the issue stems from the fact that Ollama has a client/server architecture. The Cloudron Ollama app acts as the server component (ollama serve), while running ollama run, for example, starts a client talking to that server. The client then tries to save some info in the user's HOME, which is /root here.

So far, the package was designed to act only as the server. That means you use ollama in client mode from your dev environment, or via agentic tools like opencode. Those do not run inside the Ollama app instance; they just connect to the Ollama web API and use the models available in the app.
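A minimal sketch of what such an external client looks like. The URL and model name here are placeholders for your install; the /api/tags and /api/generate endpoints are part of the standard Ollama HTTP API. The actual network calls are left commented out, since they require a reachable server.

```shell
# Sketch of talking to the app's Ollama server over HTTP (URL is a placeholder):
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"

# Request body for a one-shot, non-streaming generation:
payload='{"model": "qwen3.5:35b", "prompt": "Hello", "stream": false}'

# Sanity-check the JSON before sending:
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload ok"

# With the server reachable, the client calls would be:
#   curl -s "$OLLAMA_URL/api/tags"                      # list pulled models
#   curl -s "$OLLAMA_URL/api/generate" -d "$payload"    # run a prompt
```

Since the state lives on the client side here, nothing is ever written to /root inside the app container.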
