@jdaviescoates kinda new here and didn't know how it works. I did it now.
wheez
Posts
-
Huly : Open Source All-in-One Project Management Platform
this looks amazing! would love to have it on Cloudron.
-
Access Ollama Base URL from n8n
@girish Perfect, thank you! This actually made it work for me!
Steps:
1. Generate an API key in your OpenWebUI.
2. Add the header Authorization: Bearer {{api_key}} to a POST request to {{OpenWebUI_URL}}/ollama/api/generate with the body:
{ "model": "llama2", "prompt": "Why is the sky blue?" }
From there I get the response via the API.
To use it with n8n I can just use the regular HTTP Request node.
The URL {{OpenWebUI_URL}}/ollama/ gets forwarded to the internal Ollama endpoint. So if you want to access other parts of the Ollama API, you can just use their documentation at https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion.
In the docs, replace "localhost:11434/api/" with "{{OpenWebUI_URL}}/ollama/api/".
-
Access Ollama Base URL from n8n
@girish Yes, I am running n8n in Cloudron. OpenWebUI & n8n are on the same server, both run by Cloudron. That's why I hoped to be able to use the internal networking to access Ollama from n8n.
-
Access Ollama Base URL from n8n
I am trying to connect to the included Ollama URL from n8n running on the same server. The Ollama Base URL given in the GUI is http://127.0.0.1:11434, which is only the local address.
I tried to see if I can use the internal Docker networking to connect to that port, but somehow I am not able to find the container when I run "docker ps" on the server.
Same when I searched for the ID of the app in the Docker network.
Can someone help me connect to it over the internal Docker network routing? Thank you!