Cloudron Forum


LiteLLM - OpenRouter Self-Hosted Alternative proxy provides access to OpenAI, Bedrock, Anthropic, Gemini, etc

App Wishlist
2 Posts · 1 Poster · 2.4k Views
    jagan wrote (last edited by jagan)
    #1

      • LiteLLM: LLM Gateway on Cloudron - provides model access, logging and usage tracking across 100+ LLMs, all in the OpenAI format.

      • Main Page: https://www.litellm.ai/
      • Git: https://github.com/BerriAI/litellm/
      • Licence: AGPL
      • Docker: Yes, Docker Packages
      • Demo: 7 Day Enterprise Trial
      • Discussion/Community: Discord

      • Summary: LiteLLM exposes an OpenAI-compatible API that proxies requests to other LLM API services, providing a standardized way to interact with both open-source and commercial LLMs (a minimal usage sketch follows at the end of this post).

      This can be a self-hosted alternative to OpenRouter.
      Any application, including OpenWebUI (on Cloudron), can use this OpenAI-compatible API endpoint to access over 100 integrations: OpenAI, Amazon Bedrock, Anthropic, Google Gemini, Grok, DeepSeek, etc.

      • 100+ LLM Provider Integrations
      • Langfuse, Langsmith, OTEL Logging
      • Virtual Keys, Budgets, Teams
      • Load Balancing, RPM/TPM limits
      • LLM Guardrails

      • Notes: This would make any of the AI apps on Cloudron or outside it (e.g. Khoj) very powerful, bridge the divide between models as they are released, and give us easy access through a single API.

      This is a superpower, AI on steroids: it lets us leverage the comparative advantage of each AI model (e.g. Claude for programming, DeepSeek for low cost, and Bedrock for longer context windows and memory).


      • Alternative to / Libhunt link: OpenRouter.ai
      • Installation Instructions: https://docs.litellm.ai/docs/proxy/docker_quick_start
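
      For illustration, here is a minimal sketch of how any app would talk to a LiteLLM proxy through its OpenAI-compatible endpoint. The proxy URL, the virtual key, and the "gpt-4o" model alias are assumptions (placeholders standing in for whatever is configured in the proxy's config.yaml), not fixed values:

      ```python
      # Minimal sketch: calling a LiteLLM proxy with the standard OpenAI client.
      # Assumptions: the proxy listens on localhost:4000, "gpt-4o" is a model
      # alias defined in its config.yaml, and the key is a LiteLLM virtual key.
      from openai import OpenAI

      client = OpenAI(
          base_url="http://localhost:4000",  # LiteLLM proxy endpoint (placeholder)
          api_key="sk-litellm-virtual-key",  # virtual key issued by the proxy (placeholder)
      )

      response = client.chat.completions.create(
          model="gpt-4o",  # alias that LiteLLM routes to the configured provider
          messages=[{"role": "user", "content": "Why is a single LLM gateway useful?"}],
      )
      print(response.choices[0].message.content)
      ```

      The same endpoint could then be dropped into OpenWebUI, Khoj, or any other OpenAI-compatible client running on Cloudron.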
      jagan wrote (last edited)
      #2

        LiteLLM used to be integrated into OpenWebUI and it was marvellous.
        It was removed recently, and I posted about that here: https://forum.cloudron.io/topic/11957/litellm-removed-from-openwebui-requires-own-separate-container

        Now, having successfully tested the Docker image, I wish it could be packaged for Cloudron so it can act as a proxy for all of our AI applications that need API access to any AI provider.

        This would be fantastic - one application to rule them all!
