Access Ollama Base URL from n8n
-
I am trying to connect to the included Ollama instance from n8n running on the same server. The Ollama Base URL given in the GUI is http://127.0.0.1:11434, which is only the local address.
I tried to see whether I could use the internal Docker networking to connect to that port, but somehow I am not able to find the container when I run "docker ps" on the server.
The same happens when I search for the ID of the app in the Docker network.
Can someone help me connect to it over the internal Docker network routing? Thank you!
-
@wheez Ollama is currently local to the Open WebUI container, so it's not accessible even via the internal network.
But I see that Open WebUI itself has an API. I generated a key (from the settings UI), downloaded Mistral (also from the settings UI) and then:
$ curl -H "Authorization: Bearer sk-c1da72e682cf48c992d997b0e1e57fb9" https://xapi.smartserver.io/ollama/api/tags
{"models":[{"name":"mistral:7b","model":"mistral:7b","modified_at":"2024-04-21T15:56:00.887039891Z","size":4109865159,"digest":"61e88e884507ba5e06c49b40e6226884b2a16e872382c2b44a42f2d119d804a5","details":{"parent_model":"","format":"gguf","family":"llama","families":["llama"],"parameter_size":"7B","quantization_level":"Q4_0"},"urls":[0]}]}
I got the above hint from https://github.com/open-webui/open-webui/discussions/1349, but I'm not sure how far you will get beyond the above API call.
-
Interested in that question too. I'd like to deploy OpenWebUI and n8n on Cloudron, but I guess what you are looking for is an HTTP n8n endpoint to connect to OpenWebUI?
Such an endpoint would then use the n8n workflows?
I guess there could be multiple n8n endpoints to integrate into OpenWebUI to get different results?
-
@girish Perfect, thank you! This actually made it work for me!
Steps:
Generate an API key in your OpenWebUI.
Add the header Authorization: Bearer {{api_key}} to a POST request to {{OpenWebUI_URL}}/ollama/api/generate with a body of
{ "model": "llama2", "prompt": "Why is the sky blue?" }
From there I get the response via the API.
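Spelled out as a single curl command, that request looks like this (a sketch; substitute your own key and instance URL for the placeholders). Note that the Ollama API streams the response by default; add "stream": false to the body if you want a single JSON object back instead:
$ curl -X POST \
    -H "Authorization: Bearer {{api_key}}" \
    -H "Content-Type: application/json" \
    -d '{ "model": "llama2", "prompt": "Why is the sky blue?" }' \
    {{OpenWebUI_URL}}/ollama/api/generate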
To use it with n8n I can just use the regular HTTP Request node.
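For the node itself, this is roughly the configuration (exact field names vary between n8n versions, so treat it as an outline rather than an exact export):
Method: POST
URL: {{OpenWebUI_URL}}/ollama/api/generate
Headers: Authorization = Bearer {{api_key}}
Body Content Type: JSON
Body: { "model": "llama2", "prompt": "Why is the sky blue?" }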
The URL {{OpenWebUI_URL}}/ollama/ gets forwarded to the internal Ollama endpoint. So if you want to access other parts of the Ollama API, you can just use their documentation at https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion.
In the docs, replace "localhost:11434/api/" with "{{OpenWebUI_URL}}/ollama/api/".
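For example, Ollama's chat endpoint from those docs would become the following (same Authorization header as before; the model and message are just illustrative):
$ curl -X POST \
    -H "Authorization: Bearer {{api_key}}" \
    -H "Content-Type: application/json" \
    -d '{ "model": "mistral:7b", "messages": [{ "role": "user", "content": "Why is the sky blue?" }] }' \
    {{OpenWebUI_URL}}/ollama/api/chat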
-