There was just a comment.
Thanks, this is specific to Ollama: it still goes through some legacy code that does not expect custom headers.
You can already work around it by changing the custom endpoint name from "ollama" to something else, but I'm pushing a simple fix.
Tested this, config:

```yaml
version: 1.2.8

endpoints:
  custom:
    - name: "CloudronOllama"
      apiKey: "ollama"
      baseURL: "https://ollama-api.cloudron.dev/v1/"
      models:
        default: ["tinyllama:latest"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "Ollama"
      headers:
        Content-Type: "application/json"
        Authorization: "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTc2MTgxNjM1OSwiZXhwIjoxNzYxODE5OTU5fQ.YLZtAuIjqnApthTBfuoPyyyJ5a7N2wywn2GW9dTqUeU"
```
And indeed, this worked! danny-avila has already created a PR to fix this issue, though.
So, to get it working right now, just change `- name: "Ollama"` to e.g. `- name: "CloudronOllama"`.
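For clarity, the workaround is a one-line change to the custom endpoint in `librechat.yaml`; the replacement name is arbitrary, as far as I can tell anything that isn't "ollama" (case-insensitive) avoids the legacy code path:

```yaml
endpoints:
  custom:
    # was: - name: "Ollama"  (triggers the legacy path that drops custom headers)
    - name: "CloudronOllama"  # any name other than "ollama" works
```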