Cloudron Forum

How to configure LibreChat to discuss with secured self hosted Ollama?

Unsolved · LibreChat · 13 posts · 3 posters
  • james (Staff) wrote (#2), last edited by james:

    Hello @SansGuidon

    I assume you took the Ollama bearer token from the /app/data/.bearer_token file of the Ollama app, correct?

    • SansGuidon wrote (#3):

      Correct. I also tested that I can reach the Ollama API hosted on Cloudron using that same bearer token:
      curl -v https://ollama-api.<REDACTED>/v1/models -H "Authorization: Bearer <REDACTED>"
      It does not return anything more useful than {"object":"list","data":null}, but at least it's a 200, and that's a good enough test for me.
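      For reference, a quick way to interpret that response body (a sketch; the sample bodies below are copied from this thread, where "data" is null until a model has been pulled):

```python
import json

# Sample bodies from Ollama's OpenAI-compatible /v1/models endpoint,
# as seen in this thread.
before_pull = '{"object":"list","data":null}'
after_pull = ('{"object":"list","data":[{"id":"tinyllama:latest",'
              '"object":"model","created":1761817528,"owned_by":"library"}]}')

def model_ids(body):
    """Return the model ids listed in a /v1/models response body."""
    data = json.loads(body).get("data") or []  # treat null as an empty list
    return [m["id"] for m in data]

print(model_ids(before_pull))  # []  -> auth works, but nothing pulled yet
print(model_ids(after_pull))   # ['tinyllama:latest']
```

      A 200 with an empty (null) model list therefore only proves the bearer token is accepted, not that a model is available.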

      About me / Now

      • james (Staff) wrote (#4), replying to the test above:

        Hello @SansGuidon

        Your API request returned {"object":"list","data":null}, which indicates that no models were found.

        For testing, I ran this command in the web terminal of the Ollama app:

        ollama run tinyllama
        

        After that completes, I now get:

        curl https://ollama-api.cloudron.dev/v1/models -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU"
        {"object":"list","data":[{"id":"tinyllama:latest","object":"model","created":1761817528,"owned_by":"library"}]}
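
        As an aside: the bearer token used here is a JWT, so its payload can be decoded locally (without verifying the signature) to inspect the claims. A standard-library sketch, not specific to the Cloudron app:

```python
import base64
import json

def jwt_payload(token):
    """Decode the (unverified) payload segment of a JWT."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# The token from the curl command above.
token = ("eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9."
         "eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ."
         "YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU")
claims = jwt_payload(token)
print(claims)  # shows sub, iat (issued at), and exp (expiry) claims
```

        The exp claim means such a token eventually expires, which is worth keeping in mind when a previously working token starts returning 401.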
        

        After reproducing your setup, I also get the HTTP 401 error.
        I suspected that the apiKey: config was wrong, since we use bearer auth.
        And yes, that was it.

        Here is my working config for LibreChat:

        version: 1.2.8
        
        endpoints:
          custom:
            - name: "Ollama"
              baseURL: "https://ollama-api.cloudron.dev/v1/" 
              models:
                default: ["tinyllama:latest"]
                fetch: true
              titleConvo: true
              titleModel: "current_model"
              summarize: false
              summaryModel: "current_model"
              forcePrompt: false
              modelDisplayLabel: "Ollama"
              headers:
                Authorization: "Bearer ${OLLAMA_BEARER_TOKEN}"
        

        Always happy to help.
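
        Note that ${OLLAMA_BEARER_TOKEN} in the headers section is a placeholder that LibreChat expands from its environment (e.g. the .env file). A rough illustration of that kind of expansion — not LibreChat's actual code, and the token value here is a stand-in:

```python
import os
import re

os.environ["OLLAMA_BEARER_TOKEN"] = "example-token"  # stand-in value

def expand(template):
    """Replace ${VAR} placeholders with environment values, if set."""
    return re.sub(r"\$\{(\w+)\}",
                  lambda m: os.environ.get(m.group(1), m.group(0)),
                  template)

print(expand("Bearer ${OLLAMA_BEARER_TOKEN}"))  # Bearer example-token
```

        Keeping the token in the environment rather than in librechat.yaml avoids committing the secret along with the config.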

        • james (Staff) wrote (#5):

          Oh wait, no. With the apiKey config missing, it defaulted back to my configured GPT endpoint.
          Sorry, I will look into it.

          • james (Staff) wrote (#6):

            On restart, the LibreChat logs state:

            Oct 30 11:23:19 2025-10-30 10:23:19 error: [indexSync] error fetch failed
            

            But in the Ollama logs I see no request at this timestamp.
            When I try to send a prompt, however, a request does appear:

            Oct 30 11:24:15 172.18.0.1 - - [30/Oct/2025:10:24:15 +0000] "GET /api/tags HTTP/1.1" 401 188 "-" "axios/1.12.1"
            Oct 30 11:24:17 172.18.0.1 - - [30/Oct/2025:10:24:17 +0000] "POST /api/chat HTTP/1.1" 401 188 "-" "ollama-js/0.5.14 (x64 linux Node.js/v22.14.0)"
            

            So something is amiss here.

            • SansGuidon wrote (#7):

              Thanks @james

              • james (Staff) wrote (#8):

                I was able to capture the headers that are sent to Ollama.

                // request made when retrying the prompt ⚠️ the auth bearer header is missing
                POST /mirror HTTP/1.1
                X-Original-URI: /api/chat
                Host: 127.0.0.1:6677
                Connection: close
                Content-Length: 114
                Content-Type: application/json
                Accept: application/json
                User-Agent: ollama-js/0.5.14 (x64 linux Node.js/v22.14.0)
                accept-language: *
                sec-fetch-mode: cors
                accept-encoding: br, gzip, deflate
                
                // manual curl request - the header is there
                GET /mirror HTTP/1.1
                X-Original-URI: /v1/models
                Host: 127.0.0.1:6677
                Connection: close
                user-agent: curl/8.5.0
                accept: */*
                authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU
                

                So for some reason the header is not sent.

                • SansGuidon wrote (#9):

                  Thanks @james for confirming my suspicion. But Ollama is accepting the header, so I guess the problem is on LibreChat's side, right?

                  • james (Staff) wrote (#10):

                    Hello @SansGuidon
                    Yes, that is what we assume now as well.
                    We have created a bug report on GitHub for this.

                    • nebulon (Staff) wrote (#11):

                      Just to keep everyone updated, the GitHub issue in question is https://github.com/danny-avila/LibreChat/issues/10311

                      • james (Staff) wrote (#12), last edited by james:

                        There was just a comment on the GitHub issue:

                        Thanks, this is specific to Ollama as it still uses some legacy code where headers are not expected.

                        You can already work around it by changing the custom endpoint name from "ollama" to something else, but I'm pushing a simple fix.

                        I tested this with the following config:

                        version: 1.2.8
                        
                        endpoints:
                          custom:
                            - name: "CloudronOllama"
                              apiKey: "ollama"
                              baseURL: "https://ollama-api.cloudron.dev/v1/"
                              models:
                                default: ["tinyllama:latest"]
                                fetch: true
                              titleConvo: true
                              titleModel: "current_model"
                              summarize: false
                              summaryModel: "current_model"
                              forcePrompt: false
                              modelDisplayLabel: "Ollama"
                              headers:
                                Content-Type: "application/json"
                                Authorization: "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU"
                        

                        And indeed, this worked! In the meantime, danny-avila has already created a PR to fix this issue.
                        So, to get it working right now, just change - name: "Ollama" to e.g. - name: "CloudronOllama".


                        • SansGuidon wrote (#13):

                          That's working for me too after changing the name 😌! Thanks @james
