Access Ollama Base URL from n8n
-
I am trying to connect to the bundled Ollama instance from n8n running on the same server. The Ollama Base URL shown in the GUI is http://127.0.0.1:11434, which is only a local address.
I tried to see whether I could use internal Docker networking to connect to that port, but I cannot find the container when I run "docker ps" on the server. The same happens when I search for the app's ID in the Docker network.
Can someone help me connect to it over the internal Docker network routing? Thank you!
-
@wheez are you using Cloudron to deploy the apps? If not, you will have to ask about other deployments on the upstream n8n forum at https://community.n8n.io/
-
@wheez Ollama is currently local to the OpenWebUI container, so it's not accessible even via the internal network.
But I see that OpenWebUI itself has an API. I generated a key (from the settings UI), downloaded mistral (also from the settings UI) and then:
$ curl -H "Authorization: Bearer sk-c1da72e682cf48c992d997b0e1e57fb9" https://xapi.smartserver.io/ollama/api/tags
{"models":[{"name":"mistral:7b","model":"mistral:7b","modified_at":"2024-04-21T15:56:00.887039891Z","size":4109865159,"digest":"61e88e884507ba5e06c49b40e6226884b2a16e872382c2b44a42f2d119d804a5","details":{"parent_model":"","format":"gguf","family":"llama","families":["llama"],"parameter_size":"7B","quantization_level":"Q4_0"},"urls":[0]}]}
I got the above hint from https://github.com/open-webui/open-webui/discussions/1349 but I'm not sure how far you can go after the above API call.
-
Interested in this question too. I'd like to deploy OpenWebUI and n8n on Cloudron, but I guess what you are looking for is an HTTP n8n endpoint to connect to OpenWebUI?
Such an endpoint would then use the n8n workflows?
I guess there could be multiple n8n endpoints to integrate into OpenWebUI to get different results?
-
@girish Perfect thank you! This actually made it work for me!
Steps:
1. Generate an API key in your OpenWebUI (from the settings UI).
2. Send a POST request to {{OpenWebUI_URL}}/ollama/api/generate with the header Authorization: Bearer {{api_key}} and the body
{ "model": "llama2", "prompt": "Why is the sky blue?" }
From there I get the response via the API.
To use it with n8n I can just use the regular HTTP Request node.
Any URL under {{OpenWebUI_URL}}/ollama/ gets forwarded to the internal Ollama endpoint, so if you want to access other parts of the Ollama API you can just follow their documentation at https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion.
In the docs, replace "localhost:11434/api/" with "{{OpenWebUI_URL}}/ollama/api/".
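For anyone following along, the steps above can be sketched as a single curl call. This is a sketch, not output from a real instance: the URL and API key below are placeholders you must replace with your own, and "stream": false (from the Ollama API docs) is added so the reply arrives as one JSON object, which is easier to handle in an n8n HTTP Request node.

```shell
# Placeholders: substitute your own OpenWebUI URL and generated API key.
OPENWEBUI_URL="https://openwebui.example.com"
API_KEY="sk-REPLACE_ME"

# POST to the Ollama generate endpoint proxied through OpenWebUI.
# "stream": false returns a single JSON response instead of a stream.
curl -X POST "$OPENWEBUI_URL/ollama/api/generate" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{ "model": "llama2", "prompt": "Why is the sky blue?", "stream": false }'
```

In n8n, the same call maps onto the HTTP Request node: method POST, the URL above, an Authorization header, and the JSON body.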
-
girish has marked this topic as solved
-