Bundle with Pipelines?

OpenWebUI · 9 Posts · 5 Posters · 605 Views
Lanhild (App Dev) · #1

Even if IMO it isn't appropriate for this package, the approach of bundling Ollama and Open WebUI together in the same package is an interesting one.

I was wondering if, instead of packaging Ollama with it, we could package Pipelines and Open WebUI together.

Pipelines is really just a sandboxed runtime for Python code, built for use with Open WebUI. It has no frontend and has the same dependencies as Open WebUI.

I think this will be worth debating when the time comes to publish Pipelines to the app store.

jagan · #2

Can we please consider this?
The virtual servers we run are hardly suited for good Ollama output anyway. Now that token costs for the lower-end models from OpenAI, Grok, Anthropic, Google, etc. are low, there is really no point in bundling Ollama.

Pipelines, however, add a great deal of flexibility and functionality to OpenWebUI and can extend it almost without limit!

      Thank you

girish (Staff) · #3

I guess since it's not a webapp, we have to bundle it with the app itself. But I am not sure how flexible these pipelines are. Does one have to install many deps, i.e. can the user install whatever they want? Maybe @Lanhild knows. I wonder how that would fit into the Cloudron packaging model.

Lanhild (App Dev) · #4

@girish The dependencies to install are specified in each pipeline's or filter's frontmatter (the scripts that the software actually ingests).

I've already made the package at https://github.com/Lanhild/pipelines-cloudron and dependency installation works.
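
For anyone wondering what that frontmatter looks like in practice, here is a minimal sketch modelled on the examples in the upstream Pipelines repository; the exact frontmatter keys and the Pipeline class signature may vary between versions, so treat it as illustrative rather than authoritative.

```python
"""
title: Example Pipeline
requirements: requests
"""
# The "requirements" line in the docstring frontmatter above is what the
# Pipelines server reads in order to pip-install dependencies when the
# script is loaded.

from typing import List

import requests  # available because it was declared in the frontmatter


class Pipeline:
    def __init__(self):
        self.name = "Example Pipeline"

    async def on_startup(self):
        # Called when the Pipelines server loads this script.
        pass

    async def on_shutdown(self):
        # Called when the server unloads this script.
        pass

    def pipe(self, user_message: str, model_id: str, messages: List[dict], body: dict) -> str:
        # Minimal behaviour: echo the incoming message back to Open WebUI.
        return f"Received: {user_message}"
```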

firmansi · #5

@Lanhild Can the pipeline be installed alongside an existing OpenWebUI instance on Cloudron? I don't see that kind of information in your GitHub repo.

JOduMonT · #6

The OpenWebUI docs don't recommend using Pipelines except for advanced use cases.

@jagan said in Bundle with Pipelines?:

The virtual servers we run are hardly suited for good Ollama output anyway. Now that token costs for the lower-end models from OpenAI, Grok, Anthropic, Google, etc. are low, there is really no point in bundling Ollama.

If you want to enable other LLMs, you can already do it with any provider that offers an OpenAI-compatible API, which means virtually all of them:
simply go to admin settings -> Connections in your OpenWebUI instance and add the base API URL of your favourite provider,

• e.g. https://openrouter.ai/api/v1 for OpenRouter


Then, in Settings/Models, you will have access to all the models provided by OpenRouter.


NOTE: this also works at the user level via Direct Connections.

Along the same lines, I also managed to use Replicate.ai as a ComfyUI instance and Jina.ai as embedder and reranker.
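
To make the OpenAI-compatible point concrete: the Connections entry is just a base URL plus an API key, the same pair you would hand to the OpenAI client yourself. A rough sketch using the official openai Python package against OpenRouter (the key and the model name are placeholders; any other compatible provider works the same way):

```python
from openai import OpenAI

# The same base URL / API key pair you would enter under
# admin settings -> Connections in OpenWebUI.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",  # placeholder: your OpenRouter key
)

resp = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # placeholder: any model the provider lists
    messages=[{"role": "user", "content": "Hello from OpenWebUI on Cloudron!"}],
)
print(resp.choices[0].message.content)
```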

firmansi · #7

@JOduMonT Can you share how to use Jina as the embedder and reranker in OpenWebUI?

JOduMonT · #8

                  @firmansi

                  docs

                  • embedding: https://jina.ai/api-dashboard/embedding

                  • reranker: https://jina.ai/api-dashboard/reranker

                  • jina endpoint: https://api.jina.ai/v1

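In case it helps, here is a rough sketch of what those two endpoints expect, using plain requests. The model names and response fields are assumptions based on Jina's dashboard examples and may need adjusting; within OpenWebUI you would normally just point the embedding and reranker settings at https://api.jina.ai/v1 with your key rather than call the API directly.

```python
import requests

JINA_API_KEY = "jina_..."  # placeholder: key from the Jina API dashboard
headers = {"Authorization": f"Bearer {JINA_API_KEY}"}

# Embeddings: OpenAI-style /embeddings endpoint.
emb = requests.post(
    "https://api.jina.ai/v1/embeddings",
    headers=headers,
    json={"model": "jina-embeddings-v3", "input": ["hello world"]},
).json()
print(len(emb["data"][0]["embedding"]))  # embedding dimension

# Reranking: the /rerank endpoint takes a query plus candidate documents.
rerank = requests.post(
    "https://api.jina.ai/v1/rerank",
    headers=headers,
    json={
        "model": "jina-reranker-v2-base-multilingual",
        "query": "how do pipelines work?",
        "documents": ["Pipelines run Python code.", "Ollama serves local models."],
    },
).json()
print(rerank["results"][0])  # top-ranked document with its relevance score
```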

firmansi · #9

@JOduMonT Is the reranker using Ollama or the Jina API?
