
OpenWebUI

22 Topics 169 Posts
  • OpenWebUI - Package Updates

    Pinned
    43
    1 Vote
    43 Posts
    2k Views
    Package Updates

    [2.3.8]

    Update Ollama to 0.3.0
  • Which models know most about Doughnut Economics?

    10
    3 Votes
    10 Posts
    284 Views
    humptydumpty

    @eddowding said in Which models know most about Doughnut Economics?:

    Does it matter much if you can add docs to them and have them answer intelligently?

    I tried that yesterday. It didn't help in the slightest.

  • OpenWebUI + Anthropic / Claude

    Moved
    2
    2 Votes
    2 Posts
    64 Views
    E

    Answering my own question - yes - follow this: https://github.com/open-webui/open-webui/issues/3288

  • Web Search with Searxng

    7
    1 Vote
    7 Posts
    65 Views
    J

    @taowang Maybe worth reporting upstream? I think config.py is source code and not a configuration file. Even in their Docker image it's not stashed.

  • Using Mistral API seems broken on cloudron

    Solved
    9
    1 Vote
    9 Posts
    185 Views
    micmc

    @shrey said in Using Mistral API seems broken on cloudron:

    @nebulon Thanks!
    Indeed, that was it.

    RAM was set to 2GB, whereas it seems to require a minimum of 5.5GB to function.

    That's right, and if you can provide it with at least 8-16GB of RAM you will see a huge difference. These things are meant to consume large resources for now, but it's getting better.

  • 0 Votes
    3 Posts
    51 Views
    J

    I already have a volume, but I will create one more and mount it!

  • 0 Votes
    2 Posts
    51 Views
    girish

    Dup of https://forum.cloudron.io/topic/11988/openwebui-permissions-issue-after-last-2-2-5-update-cannot-create-directory-media-ollama-vol

  • LiteLLM removed from OpenWebUI, requires own separate container

    1
    0 Votes
    1 Post
    93 Views
    No one has replied
  • When I try to upload documents: Permission denied: '/root/nltk_data'

    3
    1 Vote
    3 Posts
    80 Views
    girish

    @JOduMonT yeah, I think you are on the old package. Please start another instance. In the latest package, we have fixed the NLTK download directory.

  • is /data/docs the right path ?

    4
    1 Vote
    4 Posts
    281 Views
    nebulon

    To be clear, /data/.. should not actually exist and is not writable. I guess you mean /app/data/data/...?

  • Since new update i can't log in anymore

    5
    0 Votes
    5 Posts
    136 Views
    D

    @girish There is no problem. We lost all of our conversation history on OpenWebUI, but we'd rather it happen now than after six months of use. We're always in favor of more stability, and it's better that this happens at the beginning, as it did now.

  • Seems to be some version issues with Ollama and OI

    2
    0 Votes
    2 Posts
    79 Views
    CptPlastic

    This is what was updated in Cloudron


  • PostgreSQL support

    4
    2 Votes
    4 Posts
    151 Views
    girish

    @Lanhild thanks, will check it out. Apps using SQLite cannot be considered stable until we figure out a proper backup strategy. Cloudron's backup strategy is to copy files, and this doesn't work well with databases in the data directory: the copied data can be inconsistent.

    Also, given there is no automatic migration from SQLite to Postgres, we might have to publish a completely new package.
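    To illustrate the inconsistency concern: copying a SQLite file while the app is writing to it can capture a half-written state, whereas SQLite's online backup API takes a coherent snapshot. The sketch below uses Python's standard sqlite3 module; it is illustrative only, not a description of how Cloudron's backup actually works, and the table and database names are made up.

    ```python
    import sqlite3

    # A plain file copy of a live database can land mid-write and be
    # inconsistent; Connection.backup() snapshots it coherently instead.
    # In-memory databases stand in for the real files to keep this self-contained.
    src = sqlite3.connect(":memory:")   # stands in for a live webui.db
    src.execute("CREATE TABLE chats (id INTEGER PRIMARY KEY, title TEXT)")
    src.execute("INSERT INTO chats (title) VALUES ('hello')")
    src.commit()

    dst = sqlite3.connect(":memory:")   # stands in for the backup destination
    src.backup(dst)                     # blocks until a consistent copy exists

    count = dst.execute("SELECT COUNT(*) FROM chats").fetchone()[0]
    print(count)  # 1
    ```

    A file-based backup would pass a connection to the destination file instead of `:memory:`; the API handles concurrent writers safely.
    
    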

  • ldap integration with ollama open webui

    Moved
    2
    0 Votes
    2 Posts
    127 Views
    girish

    What is the context for this question? Why is this a separate post?

  • The advanced settings are reset every time

    Solved
    16
    2 Votes
    16 Posts
    687 Views
    D

    @coniunctio Yes it has been resolved. Now it is working fine.

  • Should ollama be part of this app package?

    19
    7 Votes
    19 Posts
    2k Views
    JOduMonT

    Personally, I disabled the local Ollama because my Cloudron doesn't have a GPU, and running on CPU is too painful.
    In exchange, I activated a bunch of OpenAI-compatible provider APIs,
    but in the end I realized that I just need OpenRouter to access all of them.

    With OpenRouter, you can even block providers that log your queries,
    which I will file as a feature request for Open-WebUI.
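    The "block providers that log your queries" preference can also be expressed per-request on OpenRouter's API side. Below is a minimal sketch of an OpenAI-compatible chat payload with that preference; the model slug and the `provider.data_collection` field are assumptions based on my reading of OpenRouter's provider-routing docs, so verify the field names before relying on them. Nothing is sent over the network here.

    ```python
    import json

    # Sketch: a chat payload for OpenRouter asking the router to avoid
    # providers that retain prompts. The "provider" block and its
    # "data_collection" value are assumptions -- check OpenRouter's current
    # API docs before use.
    payload = {
        "model": "meta-llama/llama-3-70b-instruct",  # any slug OpenRouter serves
        "messages": [{"role": "user", "content": "Why is the sky blue?"}],
        "provider": {"data_collection": "deny"},     # skip providers that log queries
    }

    body = json.dumps(payload)
    # This body would be POSTed to https://openrouter.ai/api/v1/chat/completions
    # with an "Authorization: Bearer <OPENROUTER_API_KEY>" header.
    ```
    
    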

  • Access Ollama Base URL from n8n

    Solved
    6
    1 Vote
    6 Posts
    2k Views
    W

    @girish Perfect thank you! This actually made it work for me!

    Steps:
    Generate an API key in your OpenWebUI.

    Add the header Authorization: Bearer {{api_key}} to a POST request to {{OpenWebUI_URL}}/ollama/api/generate with a body of

    { "model": "llama2", "prompt": "Why is the sky blue?" }

    From there I get the response via the API.

    To use it with n8n I can just use the regular request node.

    The URL

    {{OpenWebUI_URL}}/ollama/

    gets forwarded to the internal Ollama endpoint, so if you want to access other parts of the Ollama API you can just use its documentation at https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion.

    In the docs, replace "localhost:11434/api/" with "{{OpenWebUI_URL}}/ollama/api/".
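    The steps above can be sketched as code. This builds the same authenticated request with Python's standard library but does not send it; the OpenWebUI URL and API key are placeholders you would substitute with your own.

    ```python
    import json
    import urllib.request

    OPENWEBUI_URL = "https://chat.example.com"   # placeholder instance URL
    API_KEY = "sk-xxxx"                          # placeholder API key from OpenWebUI

    def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
        """POST to /ollama/api/generate with a Bearer token, as in the steps above."""
        body = json.dumps({"model": model, "prompt": prompt}).encode()
        return urllib.request.Request(
            url=f"{OPENWEBUI_URL}/ollama/api/generate",
            data=body,
            headers={
                "Authorization": f"Bearer {API_KEY}",
                "Content-Type": "application/json",
            },
            method="POST",
        )

    req = build_generate_request("llama2", "Why is the sky blue?")
    print(req.full_url)  # https://chat.example.com/ollama/api/generate
    # urllib.request.urlopen(req) would actually send it and stream the completion.
    ```

    n8n's regular HTTP Request node needs exactly the same three pieces: the URL, the Authorization header, and the JSON body.
    
    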

  • The models load in the void

    6
    1 Vote
    6 Posts
    323 Views
    girish

    Just saw this post - LLaMA Now Goes Faster on CPUs

  • Can't open web UI

    Solved
    3
    1 Vote
    3 Posts
    255 Views
    Dennis4720

    Correct, problem was solved shortly after posting this.

  • let's collect some metrics

    17
    4 Votes
    17 Posts
    648 Views
    L

    One thing I would like to have as an option is a bell sound when the generation has completed. It helps me be productive elsewhere instead of waiting.

    Oh, I would suggest overriding the initial memory allocation and ramping it up to as much RAM as you can spare.