How to configure Ollama in OpenWebUI
-
Hi guys,
Could you please guide me on how to configure Ollama in OpenWebUI so that I can download a model to test?
I am getting the error below after installing OpenWebUI.
I tried changing localhost to my OpenWebUI domain, but I still get the same error. My OpenWebUI domain is behind a Cloudflare proxy.
-
Hello @harryz
The OpenWebUI Cloudron app no longer comes bundled with an internal Ollama server.
Sorry, the documentation is outdated; I will update it accordingly.
Please install the Ollama app.
Inside the Ollama app web terminal, you can run this command to pull a model:

```
ollama pull tinyllama
```

In the Ollama app web terminal, also run:

```
cat /app/data/.api_key
```

Note down this key, you will need it.
Then you need to configure the OpenWebUI app.
Log in as an admin and navigate to `https://YOUR.DOMAIN.TLD/admin/settings/connections`:
Press the configuration button for Ollama:

Click on `local` to change it to `external`:

Now, when it shows `external (1)`, set the `URL` to your Ollama app URL and paste the key from before into `Auth`.

You can press the 'reload' icon to verify the connection:

If you see the success message in the top right, you should be good to go; click the `Save` button.

Now you can use your own Ollama server and models in OpenWebUI:

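If you want to sanity-check the connection outside of OpenWebUI, you can also query the Ollama API directly with curl. This is just a sketch: `ollama.YOUR.DOMAIN.TLD` is a placeholder for your actual Ollama app domain, and it assumes the app accepts the key from `/app/data/.api_key` as a bearer token (which is how OpenWebUI's `Auth` field sends it).

```shell
# Placeholder values: substitute your own Ollama app domain and API key.
OLLAMA_URL="https://ollama.YOUR.DOMAIN.TLD"
OLLAMA_API_KEY="paste-the-key-from-/app/data/.api_key-here"

# /api/tags lists the models the server has pulled; a JSON response
# mentioning tinyllama means the URL and key are both correct.
curl -sf -H "Authorization: Bearer $OLLAMA_API_KEY" "$OLLAMA_URL/api/tags"
```

If this fails with a 401/403, re-check the key; if it times out, check the URL and any Cloudflare proxy rules in front of the app.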