<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[LiteLLM - OpenRouter Self-Hosted Alternative proxy provides access to OpenAI, Bedrock, Anthropic, Gemini, etc]]></title><description><![CDATA[<hr />
<ul>
<li><strong>LiteLLM</strong>: LLM Gateway on Cloudron - provide model access, logging and usage tracking across 100+ LLMs. All in the OpenAI format.</li>
</ul>
<hr />
<ul>
<li><strong>Main Page</strong>: <a href="https://www.litellm.ai/" target="_blank" rel="noopener noreferrer nofollow ugc">https://www.litellm.ai/</a></li>
<li><strong>Git</strong>: <a href="https://github.com/BerriAI/litellm/" target="_blank" rel="noopener noreferrer nofollow ugc">https://github.com/BerriAI/litellm/</a></li>
<li><strong>Licence</strong>: AGPL</li>
<li><strong>Docker</strong>: Yes, <a href="https://github.com/orgs/BerriAI/packages" target="_blank" rel="noopener noreferrer nofollow ugc">Docker Packages</a></li>
<li><strong>Demo</strong>: <a href="https://www.litellm.ai/#trial" target="_blank" rel="noopener noreferrer nofollow ugc">7 Day Enterprise Trial</a></li>
<li><strong>Discussion/Community</strong>: <a href="https://discord.com/invite/wuPM9dRgDw" target="_blank" rel="noopener noreferrer nofollow ugc">Discord</a></li>
</ul>
<hr />
<ul>
<li><strong>Summary</strong>: LiteLLM exposes an OpenAI-compatible API that proxies requests to other LLM API services, providing a standardized API for interacting with both open-source and commercial LLMs.</li>
</ul>
<p dir="auto">This can be a self-hosted alternative to OpenRouter.<br />
Any application, including OpenWebUI (on Cloudron), can use this OpenAI-compatible API endpoint to access over 100 integrations: OpenAI, Amazon Bedrock, Anthropic models, Google Gemini models, Grok, DeepSeek, etc.</p>
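<p dir="auto">A minimal sketch of what that looks like from an application's side: any OpenAI-compatible client simply points at the proxy instead of a vendor API. The endpoint, virtual key, and model name below are placeholders for your own deployment, not values from this thread.</p>

```python
# Build the OpenAI-format request an application would send to a LiteLLM
# proxy.  All concrete values here are illustrative placeholders.
import json
import urllib.request

LITELLM_URL = "http://localhost:4000/v1/chat/completions"  # hypothetical proxy address
VIRTUAL_KEY = "sk-1234"                                    # a LiteLLM virtual key

payload = {
    # LiteLLM maps this model name to whichever provider it is configured
    # to route it to (Anthropic, Bedrock, Gemini, ...)
    "model": "claude-sonnet",
    "messages": [{"role": "user", "content": "Say hello."}],
}

request = urllib.request.Request(
    LITELLM_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {VIRTUAL_KEY}",
    },
)

# With a running proxy you would send it like this:
# with urllib.request.urlopen(request) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
print(request.get_full_url())
```

<p dir="auto">Because the request shape is plain OpenAI format, swapping providers means changing only the proxy's config, not the application code.</p>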
<ul>
<li><a href="https://docs.litellm.ai/docs/providers" target="_blank" rel="noopener noreferrer nofollow ugc">100+ LLM Provider Integrations</a></li>
<li>Langfuse, Langsmith, OTEL Logging</li>
<li>Virtual Keys, Budgets, Teams</li>
<li>Load Balancing, RPM/TPM limits</li>
<li>LLM Guardrails</li>
</ul>
<hr />
<ul>
<li><strong>Notes</strong>: This would make any AI app on Cloudron or outside it (e.g. Khoj) much more powerful, bridge the divide between models as they are released, and afford us easy access through a single API.</li>
</ul>
<p dir="auto">This is a superpower, AI on steroids, letting us leverage the comparative advantage of each AI model (e.g. Claude for programming, DeepSeek for low cost, and Bedrock for a longer context window and memory, etc.).</p>
<hr />
<ul>
<li><strong>Alternative to / Libhunt link</strong>: <a href="http://OpenRouter.ai" target="_blank" rel="noopener noreferrer nofollow ugc">OpenRouter.ai</a></li>
<li><strong>Installation Instructions</strong>: <a href="https://docs.litellm.ai/docs/proxy/docker_quick_start" target="_blank" rel="noopener noreferrer nofollow ugc">https://docs.litellm.ai/docs/proxy/docker_quick_start</a></li>
</ul>
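<p dir="auto">For reference, the quick start linked above boils down to a config file plus one container. A hedged sketch follows; the model names, image tag, and key placeholder are illustrative, so check the linked docs for current values.</p>

```shell
# Write a minimal LiteLLM proxy config (illustrative values).
cat > litellm_config.yaml <<'EOF'
model_list:
  - model_name: claude-sonnet               # the name clients will request
    litellm_params:
      model: anthropic/claude-3-5-sonnet    # provider/model LiteLLM routes to
      api_key: os.environ/ANTHROPIC_API_KEY # read from the container env
EOF

# With Docker available, the proxy is then started roughly like this
# (shown as a comment so the snippet is safe to run standalone):
#   docker run -p 4000:4000 \
#     -v "$(pwd)/litellm_config.yaml:/app/config.yaml" \
#     -e ANTHROPIC_API_KEY="..." \
#     ghcr.io/berriai/litellm:main-latest --config /app/config.yaml
```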
]]></description><link>https://forum.cloudron.io/topic/13141/litellm-openrouter-self-hosted-alternative-proxy-provides-access-to-openai-bedrock-anthropic-gemini-etc</link><generator>RSS for Node</generator><lastBuildDate>Sat, 14 Mar 2026 09:25:07 GMT</lastBuildDate><atom:link href="https://forum.cloudron.io/topic/13141.rss" rel="self" type="application/rss+xml"/><pubDate>Sat, 18 Jan 2025 03:54:48 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to LiteLLM - OpenRouter Self-Hosted Alternative proxy provides access to OpenAI, Bedrock, Anthropic, Gemini, etc on Sat, 21 Feb 2026 20:54:50 GMT]]></title><description><![CDATA[<p dir="auto"><a class="plugin-mentions-user plugin-mentions-a" href="/user/marcusquinn" aria-label="Profile: marcusquinn">@<bdi>marcusquinn</bdi></a> Worth a look? Seems better in theory? Did you maybe try both? Does any-llm offer the same functionality?</p>
]]></description><link>https://forum.cloudron.io/post/120565</link><guid isPermaLink="true">https://forum.cloudron.io/post/120565</guid><dc:creator><![CDATA[growco]]></dc:creator><pubDate>Sat, 21 Feb 2026 20:54:50 GMT</pubDate></item><item><title><![CDATA[Reply to LiteLLM - OpenRouter Self-Hosted Alternative proxy provides access to OpenAI, Bedrock, Anthropic, Gemini, etc on Sun, 27 Jul 2025 14:50:12 GMT]]></title><description><![CDATA[<p dir="auto"><a class="plugin-mentions-user plugin-mentions-a" href="/user/necrevistonnezr" aria-label="Profile: necrevistonnezr">@<bdi>necrevistonnezr</bdi></a> Nice find. That is very interesting!</p>
]]></description><link>https://forum.cloudron.io/post/110752</link><guid isPermaLink="true">https://forum.cloudron.io/post/110752</guid><dc:creator><![CDATA[marcusquinn]]></dc:creator><pubDate>Sun, 27 Jul 2025 14:50:12 GMT</pubDate></item><item><title><![CDATA[Reply to LiteLLM - OpenRouter Self-Hosted Alternative proxy provides access to OpenAI, Bedrock, Anthropic, Gemini, etc on Sun, 27 Jul 2025 05:37:08 GMT]]></title><description><![CDATA[<p dir="auto">LiteLLM is quite interesting as it has a Presidio plugin (<a href="https://docs.litellm.ai/docs/proxy/guardrails/pii_masking_v2" target="_blank" rel="noopener noreferrer nofollow ugc">https://docs.litellm.ai/docs/proxy/guardrails/pii_masking_v2</a>), i.e. it can mask PII (Personally Identifiable Information), PHI (Protected Health Information), and other sensitive data <em>before</em> sending data to an LLM - think anonymizing a document before having it analyzed by an LLM. That can make Gemini, Claude, etc. usable under GDPR jurisdiction.</p>
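<p dir="auto">Per the pii_masking_v2 doc linked above, this masking is enabled in the proxy's config file. A hedged sketch, with the guardrail name and mode values taken from the docs at time of writing; the Presidio analyzer/anonymizer services themselves run separately and are pointed to via environment variables:</p>

```yaml
# Illustrative fragment of a LiteLLM proxy config.yaml; check the linked
# docs for the current field names and required Presidio env vars
# (e.g. PRESIDIO_ANALYZER_API_BASE / PRESIDIO_ANONYMIZER_API_BASE).
guardrails:
  - guardrail_name: "presidio-pii"
    litellm_params:
      guardrail: presidio
      mode: "pre_call"   # mask PII before the request ever reaches the LLM
```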
]]></description><link>https://forum.cloudron.io/post/110744</link><guid isPermaLink="true">https://forum.cloudron.io/post/110744</guid><dc:creator><![CDATA[necrevistonnezr]]></dc:creator><pubDate>Sun, 27 Jul 2025 05:37:08 GMT</pubDate></item><item><title><![CDATA[Reply to LiteLLM - OpenRouter Self-Hosted Alternative proxy provides access to OpenAI, Bedrock, Anthropic, Gemini, etc on Sun, 27 Jul 2025 03:20:40 GMT]]></title><description><![CDATA[<p dir="auto"><a class="plugin-mentions-user plugin-mentions-a" href="/user/jagan" aria-label="Profile: jagan">@<bdi>jagan</bdi></a> Might also be worth a look:</p>
<ul>
<li><a href="https://github.com/mozilla-ai/any-llm" target="_blank" rel="noopener noreferrer nofollow ugc">https://github.com/mozilla-ai/any-llm</a></li>
</ul>
<p dir="auto"><img src="/assets/uploads/files/1753586436183-d63514c2-50a9-40a3-82ac-ff6a5cbab621-image-resized.png" alt="d63514c2-50a9-40a3-82ac-ff6a5cbab621-image.png" class=" img-fluid img-markdown" /></p>
]]></description><link>https://forum.cloudron.io/post/110739</link><guid isPermaLink="true">https://forum.cloudron.io/post/110739</guid><dc:creator><![CDATA[marcusquinn]]></dc:creator><pubDate>Sun, 27 Jul 2025 03:20:40 GMT</pubDate></item><item><title><![CDATA[Reply to LiteLLM - OpenRouter Self-Hosted Alternative proxy provides access to OpenAI, Bedrock, Anthropic, Gemini, etc on Sat, 26 Jul 2025 18:23:02 GMT]]></title><description><![CDATA[<p dir="auto">Agreed. This would be a great app IMO.</p>
<p dir="auto">The ideal set &amp; forget self-hosted AI API routing setup and controller for all the other AI apps we will inevitably start to rely on.</p>
<p dir="auto">Could do with this to make LibreChat easier to set up and to minimise .yml maintenance, too:</p>
<ul>
<li><a href="https://forum.cloudron.io/topic/14034/starting-a-conversation">https://forum.cloudron.io/topic/14034/starting-a-conversation</a></li>
</ul>
]]></description><link>https://forum.cloudron.io/post/110731</link><guid isPermaLink="true">https://forum.cloudron.io/post/110731</guid><dc:creator><![CDATA[marcusquinn]]></dc:creator><pubDate>Sat, 26 Jul 2025 18:23:02 GMT</pubDate></item><item><title><![CDATA[Reply to LiteLLM - OpenRouter Self-Hosted Alternative proxy provides access to OpenAI, Bedrock, Anthropic, Gemini, etc on Sat, 18 Jan 2025 04:01:10 GMT]]></title><description><![CDATA[<p dir="auto">LiteLLM was earlier integrated into OpenWebUI and was marvellous.<br />
It was removed recently and I had posted about it here: <a href="https://forum.cloudron.io/topic/11957/litellm-removed-from-openwebui-requires-own-separate-container">https://forum.cloudron.io/topic/11957/litellm-removed-from-openwebui-requires-own-separate-container</a></p>
<p dir="auto">Now, having successfully tested the Docker image, I wish it could be on Cloudron and act as a proxy for all of our AI applications that require API access to any AI provider.</p>
<p dir="auto">This would be fantastic - one application to rule them all!</p>
]]></description><link>https://forum.cloudron.io/post/100140</link><guid isPermaLink="true">https://forum.cloudron.io/post/100140</guid><dc:creator><![CDATA[jagan]]></dc:creator><pubDate>Sat, 18 Jan 2025 04:01:10 GMT</pubDate></item></channel></rss>