Cloudron Forum

How to configure LibreChat to discuss with secured self hosted Ollama?

Unsolved | LibreChat | 13 Posts | 3 Posters
SansGuidon #1

Hi
I've installed LibreChat and Ollama on Cloudron and followed the checklists for each app.
LibreChat talks to Mistral without trouble using my API key.
However, I can't get it to work with my own Ollama instance, and I don't know what I'm doing wrong.
I get this error from LibreChat:

    Something went wrong. Here's the specific error message we encountered:
    
    An error occurred while processing the request: 
    <html> <head><title>401 Authorization Required</title></head> <body> <center><h1>401 Authorization Required</h1></center> <hr><center>nginx/1.24.0 (Ubuntu)</center> </body> </html>
    

I guess a Bearer token is missing, but even when I add it to the librechat.yaml configuration, the problem remains.

Any shared experience or example would help. I found the documentation very light on this issue, and the examples I see of configuring LibreChat for Ollama in other contexts show various base URLs for Ollama; some use 'ollama' as the API key, others provide a Bearer token.

    My librechat.yaml config:

    version: 1.2.8
    
    endpoints:
      custom:
        - name: "Ollama"
          apiKey: "ollama"
          baseURL: "https://ollama-api.<REDACTED>/v1"
          models:
            default:
              - "llama3:latest"
              - "mistral"
              - "gemma:7b"
              - "phi3:mini"
            fetch: false
          headers:
            Authorization: 'Bearer ${OLLAMA_BEARER_TOKEN}'
          titleConvo: true
          titleModel: "current_model"
          summarize: false
          summaryModel: "current_model"
          forcePrompt: false
          modelDisplayLabel: "Ollama"
    

Does it look correct? The OLLAMA_BEARER_TOKEN is defined in the .env file.
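For reference, a sketch of the matching .env entry: LibreChat substitutes `${OLLAMA_BEARER_TOKEN}` in librechat.yaml from its environment. The variable name is the one used in the YAML above; the token value here is a placeholder for the real token.

```
# hypothetical .env entry; the value is a placeholder for the real token
OLLAMA_BEARER_TOKEN=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.PLACEHOLDER
```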

    Thanks in advance

    About me / Now

james (Staff) #2

      Hello @SansGuidon

      I assume you took the Ollama bearer token from the /app/data/.bearer_token file of the Ollama app, correct?
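For anyone reading along, a self-contained sketch of that check. On Cloudron the Ollama app writes its proxy token to /app/data/.bearer_token; a temp file stands in for it here so the snippet runs anywhere.

```shell
# Hypothetical sketch: read the proxy token and build the auth header.
# A temp file simulates /app/data/.bearer_token from the Ollama app's terminal.
tokenfile="$(mktemp)"
printf 'example-token' > "$tokenfile"   # stand-in for /app/data/.bearer_token

TOKEN="$(cat "$tokenfile")"
echo "Authorization: Bearer ${TOKEN}"   # prints: Authorization: Bearer example-token
```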

SansGuidon #3

Correct. I also tested that I can reach the Ollama API hosted on Cloudron using that same Bearer token:
curl -v https://ollama-api.<REDACTED>/v1/models -H "Authorization: Bearer <REDACTED>"
It does not return anything more useful than {"object":"list","data":null}, but at least it's a 200 and a good enough test for me.


james (Staff) #4

          Hello @SansGuidon

Your API request returned {"object":"list","data":null}, which indicates that no models were found.

For testing, I ran this command in the web terminal of the Ollama app:

          ollama run tinyllama
          

          After that is done, now I get:

          curl https://ollama-api.cloudron.dev/v1/models -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU"
          {"object":"list","data":[{"id":"tinyllama:latest","object":"model","created":1761817528,"owned_by":"library"}]}
          
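As an aside, a small Python sketch (not from the thread) of how a script can read those /v1/models responses; the response bodies are the two shown in this thread, and "data" is JSON null until at least one model has been pulled:

```python
import json

# response bodies taken from the curl output shown in this thread
with_models = '{"object":"list","data":[{"id":"tinyllama:latest","object":"model","created":1761817528,"owned_by":"library"}]}'
no_models = '{"object":"list","data":null}'  # JSON null becomes Python None

def model_ids(body: str) -> list[str]:
    resp = json.loads(body)
    # guard against "data": null before any model has been pulled
    return [m["id"] for m in (resp.get("data") or [])]

print(model_ids(with_models))  # ['tinyllama:latest']
print(model_ids(no_models))    # []
```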

After reproducing your setup I also got the HTTP 401 error.
I suspected the apiKey: config was wrong, since we use bearer auth.
And yes, that was it.

          Here is my working config for LibreChat:

          version: 1.2.8
          
          endpoints:
            custom:
              - name: "Ollama"
                baseURL: "https://ollama-api.cloudron.dev/v1/" 
                models:
                  default: [
                    "tinyllama:latest"
                    ]
                  fetch: true
                titleConvo: true
                titleModel: "current_model"
                summarize: false
                summaryModel: "current_model"
                forcePrompt: false
                modelDisplayLabel: "Ollama"
                headers:
                  Authorization: "Bearer ${OLLAMA_BEARER_TOKEN}"
          

          Always happy to help.

james (Staff) #5

Oh, wait, no. With the apiKey config missing, it fell back to my configured GPT endpoint.
Sorry, I will look into it.

james (Staff) #6

On restart, the LibreChat logs show:

              Oct 30 11:23:19 2025-10-30 10:23:19 error: [indexSync] error fetch failed
              

But in the Ollama logs there is no request at this timestamp.
When I try to send a prompt, however, a request does come through:

              Oct 30 11:24:15 172.18.0.1 - - [30/Oct/2025:10:24:15 +0000] "GET /api/tags HTTP/1.1" 401 188 "-" "axios/1.12.1"
              Oct 30 11:24:17 172.18.0.1 - - [30/Oct/2025:10:24:17 +0000] "POST /api/chat HTTP/1.1" 401 188 "-" "ollama-js/0.5.14 (x64 linux Node.js/v22.14.0)"
              

So, something is amiss here.

SansGuidon #7

                Thanks @james


james (Staff) #8

I was able to capture the headers that are sent to Ollama.

// request made when re-trying the prompt; ⚠️ the auth bearer header is missing
POST /mirror HTTP/1.1
X-Original-URI: /api/chat
Host: 127.0.0.1:6677
Connection: close
Content-Length: 114
Content-Type: application/json
Accept: application/json
User-Agent: ollama-js/0.5.14 (x64 linux Node.js/v22.14.0)
accept-language: *
sec-fetch-mode: cors
accept-encoding: br, gzip, deflate

// manual curl request; the header is there
GET /mirror HTTP/1.1
X-Original-URI: /v1/models
Host: 127.0.0.1:6677
Connection: close
user-agent: curl/8.5.0
accept: */*
authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU

So for some reason the header is not sent.
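The thread doesn't show how this capture was wired up; a hypothetical nginx sketch consistent with the X-Original-URI and Host: 127.0.0.1:6677 headers above would use nginx's mirror directive to copy each incoming request to a local debug listener (the Ollama upstream port 11434 is the Ollama default, assumed here, not taken from the thread):

```
# hypothetical debug setup, not the Ollama app's actual config
location / {
    mirror /mirror;                          # copy every request to /mirror
    proxy_pass http://127.0.0.1:11434;       # assumed Ollama upstream port
}
location = /mirror {
    internal;
    proxy_pass http://127.0.0.1:6677$request_uri;   # debug listener dumps headers
    proxy_set_header X-Original-URI $request_uri;
}
```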

SansGuidon #9

Thanks @james for confirming my suspicion. But Ollama is accepting the header, so I guess the problem is on LibreChat's side, right?


james (Staff) #10

                      Hello @SansGuidon
                      Yes, that is what we assume now as well.
                      We have created a bug report on GitHub for this.

nebulon (Staff) #11

Just to keep everyone updated, the GitHub issue in question is https://github.com/danny-avila/LibreChat/issues/10311

james (Staff) #12

There was just a comment on the GitHub issue:

                          Thanks, this is specific to Ollama as it still uses some legacy code where headers are not expected.

                          You can already work around it by changing the custom endpoint name from "ollama" to something else, but I'm pushing a simple fix.

I tested this with the following config:

                          version: 1.2.8
                          
                          endpoints:
                            custom:
                              - name: "CloudronOllama"
                                apiKey: "ollama"
                                baseURL: "https://ollama-api.cloudron.dev/v1/"
                                models:
                                  default: ["tinyllama:latest"]
                                  fetch: true
                                titleConvo: true
                                titleModel: "current_model"
                                summarize: false
                                summaryModel: "current_model"
                                forcePrompt: false
                                modelDisplayLabel: "Ollama"
                                headers:
                                  Content-Type: "application/json"
                                  Authorization: "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU"
                          

And indeed, this worked! danny-avila has already created a PR to fix this issue.
So, to get it working right now, just change - name: "Ollama" to e.g. - name: "CloudronOllama".
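To illustrate why renaming works, a purely illustrative Python sketch (not LibreChat's actual code): per the comment above, a legacy code path keyed on the endpoint name builds the request without the custom headers, so any other name takes the normal path:

```python
# Illustrative only: a name-based branch of the kind described in the issue.
def build_request_headers(endpoint_name: str, custom_headers: dict) -> dict:
    if endpoint_name.lower() == "ollama":
        # hypothetical legacy path: custom headers are dropped
        return {"Content-Type": "application/json"}
    # normal custom-endpoint path: headers pass through
    return {"Content-Type": "application/json", **custom_headers}

print("Authorization" in build_request_headers("Ollama", {"Authorization": "Bearer x"}))          # False
print("Authorization" in build_request_headers("CloudronOllama", {"Authorization": "Bearer x"}))  # True
```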

(screenshot attached)

SansGuidon #13

That's working for me too after changing the name 😌! Thanks @james

