Cloudron makes it easy to run web apps like WordPress, Nextcloud, and GitLab on your server.


OpenWebUI

76 Topics 706 Posts
  • ETA for GPU support? Could we contribute to help it along?

    3 Votes
    22 Posts
    5k Views
    @Lanhild said in ETA for GPU support? Could we contribute to help it along?: A lot of companies that might deploy Cloudron for its ease-of-life features don't necessarily have a VPS with a GPU. Also (it might help you to deepen your Cloudron knowledge), Cloudron packages are usually only one component/application. Moreover, OpenWebUI is "just" a UI that supports connections to Ollama and isn't affiliated with it, meaning that Ollama isn't a dependency of it at all.

    Excellent points @Lanhild - you've convinced me. And there are benefits on the Ollama side too: using Cloudron to keep our Ollama installation automatically up to date on its own would be a real benefit, for instance.

    In fact, given our continued inability to modify the existing Cloudron OpenWebUI app to run with our GPUs, for our small clients we are now thinking this way, i.e. using Cloudron just for the OpenWebUI component and letting them connect to our separately hosted Ollama. It's a bit less convenient than we were hoping, but at least we'll still have segregated data and user management for each client in OpenWebUI. So now I also want a Cloudron OpenWebUI app that does not come with bundled Ollama, so that I can be sure these customers don't hammer our CPUs and get frustrated by a slow user experience.
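For readers taking the same route, pointing a standalone Open WebUI at a separately hosted Ollama usually comes down to its Ollama base URL. A minimal sketch using Open WebUI's `OLLAMA_BASE_URL` environment variable; the hostname below is a placeholder for your own GPU host:

```shell
# Point Open WebUI at an external Ollama instead of a bundled one.
# "ollama.example.com" is a placeholder for a separately hosted GPU machine.
export OLLAMA_BASE_URL="http://ollama.example.com:11434"
```

The same connection can also be changed at runtime in Open WebUI's admin settings under Connections.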
  • 0 Votes
    9 Posts
    3k Views
    I resolved the issue. I had the directory set to /media/TSYSAI instead of just /media. Thanks!
  • Version 0.4.0

    1 Vote
    6 Posts
    2k Views
    firmansi
    @micmc Yes, hopefully AnythingLLM can also come to Cloudron
  • Nice "smol" language model for OpenWebUI

    ai llm smol openwebui
    1 Vote
    1 Post
    257 Views
    No one has replied
  • can not use this app

    0 Votes
    8 Posts
    3k Views
    ehson_mokhtary
    OK, I think it was a RAM issue: when there is not enough RAM, it gives this error. Thanks, guys
  • models

    0 Votes
    3 Posts
    1k Views
    mdc773
    @micmc Agreed, I do use it on my computer, but I was wanting to let my buddies play with it. But hey, thanks
  • OpenWebUI 0.3.35

    0 Votes
    2 Posts
    580 Views
    This is out now
  • Running AUTOMATIC1111 or ComfyUI, is it possible?

    0 Votes
    1 Post
    225 Views
    No one has replied
  • Resource punkt_tab not found

    Solved
    0 Votes
    7 Posts
    3k Views
    nebulon
    Thanks, I was able to reproduce and fix it. It will be part of the next package update
  • Web Search Issue with OpenWebUI 0.3.26

    Solved
    0 Votes
    3 Posts
    382 Views
    firmansi
    Awesome, now everything works just fine... salute to the Cloudron team
  • Websocket Redis Support

    1 Vote
    2 Posts
    829 Views
    nebulon
    We would have to adjust the package to pre-configure the app to optionally use Redis. However, this is only relevant for multi-instance setups, which is not the case on Cloudron anyway, and for instances with many concurrent users, which is likely not the main focus for this app on Cloudron. So for now, this optional Redis support is out of scope. From the changelog: Enhanced load balancing capabilities for multiple instance setups
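For anyone tracking this upstream anyway, Open WebUI's optional Redis-backed websocket support is driven by environment variables along these lines. A sketch only; the Redis URL is a placeholder:

```shell
# Optional websocket support backed by Redis, intended for multi-instance
# load balancing. These are upstream Open WebUI variables; the Redis URL
# below is a placeholder for your own Redis instance.
export ENABLE_WEBSOCKET_SUPPORT="true"
export WEBSOCKET_MANAGER="redis"
export WEBSOCKET_REDIS_URL="redis://localhost:6379/0"
```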
  • Is the app still unstable?

    1 Vote
    3 Posts
    909 Views
    girish
    The Open Web UI project initially started out as an Ollama frontend. To run local models, one needs GPU support, and Cloudron currently does not have GPU support. Over time, Open Web UI has shifted to being a frontend for OpenAI API-compatible services. It is also now possible to configure Ollama on another computer which has a GPU and then point Open Web UI to use that Ollama. I guess we can mark this as stable, since it's still useful while we figure out GPU support in Cloudron.
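As a sketch of that setup, the upstream project reads environment variables for OpenAI API-compatible backends; the URL and key below are placeholders, and any compatible endpoint should work:

```shell
# Use an OpenAI API-compatible service as the model backend.
# Both values are placeholders; substitute your own endpoint and key.
export OPENAI_API_BASE_URL="https://api.openai.com/v1"
export OPENAI_API_KEY="your-key-here"
```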
  • `NLTK_DATA` variable not working

    0 Votes
    1 Post
    261 Views
    No one has replied
  • RAG not working anymore

    0 Votes
    3 Posts
    975 Views
    Lanhild
    @vladimir-d In the future, I believe we should let the app handle most of its configuration, environment-wise. Variables such as RAG_EMBEDDING_MODEL, etc. are all set via the application UI and saved in the config.json file. As for the initial issue of this topic, I've solved it by resetting the embedding model engine.
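For context, variables like these can seed the initial RAG configuration, but values changed in the UI are persisted to config.json and take precedence afterwards. A sketch with upstream variable names; the model name is just an illustrative default:

```shell
# Seed the embedding engine/model on first start; changes made later in the
# UI are saved to config.json and override these. Values are illustrative.
export RAG_EMBEDDING_ENGINE=""   # empty string = local sentence-transformers
export RAG_EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
```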
  • Latest Release 0.3.16

    Solved
    4
    0 Votes
    4 Posts
    980 Views
    @firmansi Not needed; apps are packaged as containers that have the right dependencies
  • Problem with JSON Response

    0 Votes
    1 Post
    264 Views
    No one has replied
  • is /data/docs the right path ?

    1 Vote
    6 Posts
    4k Views
    nebulon
    @Jack-613 This is a Cloudron-related forum, so if you are not using OpenWebUI on Cloudron, then you should probably ask in the upstream project instead.
  • Something seems broken / can't download models

    Solved
    0 Votes
    8 Posts
    2k Views
  • Continue.dev and OpenWebUI

    1 Vote
    3 Posts
    2k Views
    mario
    Ah, ok. I was under the impression that Ollama was listening just on localhost inside this app. If it's not, this would work then. I'll test.
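A quick way to check that assumption is to hit the bundled Ollama's API from a different machine; if it answers, Ollama is not bound to localhost only. The hostname below is a placeholder for the app's domain:

```shell
# Ollama's /api/tags endpoint lists installed models; a JSON reply from a
# remote machine means Ollama is reachable beyond localhost. Placeholder host.
curl http://my-openwebui.example.com:11434/api/tags
```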
  • Which models know most about Doughnut Economics?

    3 Votes
    10 Posts
    3k Views
    humptydumpty
    @eddowding said in Which models know most about Doughnut Economics?: Does it matter much if you can add docs to them and have them answer intelligently? I tried that yesterday. It didn't help in the slightest.