How to configure LibreChat to talk to a secured, self-hosted Ollama?
- 
correct. I also tested that I can reach the Ollama API hosted on Cloudron using that same Bearer token:

```bash
curl -v https://ollama-api.<REDACTED>/v1/models -H "Authorization: Bearer <REDACTED>"
```

It does not return anything more useful than `{"object":"list","data":null}`, but at least it's a 200 and a good enough test for me.
- 
Hello @SansGuidon

Your API request returned `{"object":"list","data":null}`, which indicates that no models were found.

For testing, in the web terminal of the Ollama app I ran this command:

```bash
ollama run tinyllama
```

After that is done, I now get:

```bash
curl https://ollama-api.cloudron.dev/v1/models -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU"
```

```json
{"object":"list","data":[{"id":"tinyllama:latest","object":"model","created":1761817528,"owned_by":"library"}]}
```
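As an aside: if you would rather not use the web terminal, the pull should also work through the secured API itself. A sketch, assuming the proxy forwards Ollama's native `/api/pull` endpoint the same way it forwards `/v1/models` (untested here):

```bash
# Pull a model through the reverse proxy; Ollama streams JSON progress lines back.
curl https://ollama-api.cloudron.dev/api/pull \
  -H "Authorization: Bearer <REDACTED>" \
  -H "Content-Type: application/json" \
  -d '{"name": "tinyllama"}'
```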
After reproducing your setup, I also get the HTTP 401 error. I suspect that the `apiKey:` config is wrong, since we use bearer auth.
And yes, that was it. Here is my working config for LibreChat:

```yaml
version: 1.2.8

endpoints:
  custom:
    - name: "Ollama"
      baseURL: "https://ollama-api.cloudron.dev/v1/"
      models:
        default: ["tinyllama:latest"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "Ollama"
      headers:
        Authorization: "Bearer ${OLLAMA_BEARER_TOKEN}"
```

Always happy to help.
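PS: the `${OLLAMA_BEARER_TOKEN}` placeholder is resolved from LibreChat's environment. A minimal sketch, assuming a Docker Compose setup with a `.env` file next to `librechat.yaml` (the variable name is just my choice, not a LibreChat built-in):

```bash
# .env — librechat.yaml's ${OLLAMA_BEARER_TOKEN} is substituted from here.
# Use the same token your Ollama reverse proxy expects.
OLLAMA_BEARER_TOKEN=<REDACTED>
```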
- 
On restart, the LibreChat logs state:

```
Oct 30 11:23:19 2025-10-30 10:23:19 error: [indexSync] error fetch failed
```

But in the Ollama logs I can see no request for this timestamp.

When I try to send a prompt, however, I do get requests:

```
Oct 30 11:24:15 172.18.0.1 - - [30/Oct/2025:10:24:15 +0000] "GET /api/tags HTTP/1.1" 401 188 "-" "axios/1.12.1"
Oct 30 11:24:17 172.18.0.1 - - [30/Oct/2025:10:24:17 +0000] "POST /api/chat HTTP/1.1" 401 188 "-" "ollama-js/0.5.14 (x64 linux Node.js/v22.14.0)"
```

So, something is astray here.
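Those 401s mean the requests do reach the proxy, just without valid credentials. A quick check to confirm it's the missing header (a sketch, reusing the token from the earlier test) is to call the same native endpoint both ways:

```bash
# Without the Authorization header: expect the same 401 the LibreChat requests get.
curl -i https://ollama-api.cloudron.dev/api/tags

# With the header: expect a 200 and the model list.
curl -i https://ollama-api.cloudron.dev/api/tags \
  -H "Authorization: Bearer <REDACTED>"
```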
- 
Thanks @james 
- 
I was able to capture the headers that are sent to Ollama.

```
// request made when re-trying the prompt ⚠️ the auth bearer header is missing
POST /mirror HTTP/1.1
X-Original-URI: /api/chat
Host: 127.0.0.1:6677
Connection: close
Content-Length: 114
Content-Type: application/json
Accept: application/json
User-Agent: ollama-js/0.5.14 (x64 linux Node.js/v22.14.0)
accept-language: *
sec-fetch-mode: cors
accept-encoding: br, gzip, deflate

// manual curl request - header is there
GET /mirror HTTP/1.1
X-Original-URI: /v1/models
Host: 127.0.0.1:6677
Connection: close
user-agent: curl/8.5.0
accept: */*
authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU
```

So for some reason the header is not sent.
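For anyone wanting to reproduce this capture: the dump above looks like the proxy mirroring each request to a local listener on port 6677. A minimal sketch of such a listener (the port comes from the dump above; how the mirror itself is wired up is an assumption, not something documented here):

```bash
# Print each mirrored request verbatim: request line, headers, and body.
# Loop, because the connection closes after every request.
# Note: some netcat builds want `nc -l -p 6677` instead of `nc -l 6677`.
while true; do
  nc -l 6677
done
```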
- 
Thanks @james for confirming my suspicion. But Ollama is accepting the header, so I guess the problem is on LibreChat's side, right?
- 
There was just a comment:

> Thanks, this is specific to Ollama as it still uses some legacy code where headers are not expected. You can already work around it by changing the custom endpoint name from "ollama" to something else, but I'm pushing a simple fix.

I tested this config:

```yaml
version: 1.2.8

endpoints:
  custom:
    - name: "CloudronOllama"
      apiKey: "ollama"
      baseURL: "https://ollama-api.cloudron.dev/v1/"
      models:
        default: ["tinyllama:latest"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "Ollama"
      headers:
        Content-Type: "application/json"
        Authorization: "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU"
```

And indeed, this worked! danny-avila has also already created a PR to fix this issue.
So, to get it working right now, just change `- name: "Ollama"` to e.g. `- name: "CloudronOllama"`.
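Once renamed, you can also verify the chat path end to end from the shell while waiting for the upstream fix. A sketch, assuming the proxy forwards Ollama's OpenAI-compatible `/v1/chat/completions` route the same way it forwards `/v1/models`:

```bash
# Send a one-off chat request through the secured proxy.
curl https://ollama-api.cloudron.dev/v1/chat/completions \
  -H "Authorization: Bearer <REDACTED>" \
  -H "Content-Type: application/json" \
  -d '{"model": "tinyllama:latest", "messages": [{"role": "user", "content": "Say hello"}]}'
```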
- 
That's working for me too, by changing the name! Thanks @james!
 


