LiteLLM removed from OpenWebUI, requires own separate container
-
It seems LiteLLM has been removed from OpenWebUI as a bundled internal tool (see the GitHub thread).
Now, the advice is to run LiteLLM separately in its own container.
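As a sketch of that setup (the image name and CLI flags follow the LiteLLM docs; the port, key, and model string are placeholder assumptions, not from this thread), LiteLLM can run as its own container next to OpenWebUI:

```shell
# Sketch: run the LiteLLM proxy in its own container (adjust tag/port/model).
docker run -d --name litellm \
  -p 4000:4000 \
  -e OPENROUTER_API_KEY="sk-or-..." \
  ghcr.io/berriai/litellm:main-latest \
  --model openrouter/anthropic/claude-3.5-sonnet --port 4000
```

OpenWebUI can then be pointed at http://host:4000/v1 as an OpenAI-compatible connection.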
Does anybody know the implications of this please?
Can we still use OpenAI compatible APIs?
What about the OpenRouter API? A how-to on migrating LiteLLM to an external setup has been posted in the documentation.
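On the OpenAI-compatible question: the LiteLLM proxy exposes the same /chat/completions shape, so any OpenAI-style client should work against it. A minimal sketch (the base URL, placeholder key, and model string are assumptions; the "openrouter/" prefix follows LiteLLM's provider-prefix convention):

```python
# Sketch: build an OpenAI-compatible request for a separately-running
# LiteLLM proxy. Assumes the proxy listens on http://localhost:4000.
import json
import urllib.request

LITELLM_BASE = "http://localhost:4000"  # assumed proxy address


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request for the proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{LITELLM_BASE}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer sk-anything",  # placeholder key
        },
        method="POST",
    )


req = build_chat_request("openrouter/anthropic/claude-3.5-sonnet", "Hello!")
# urllib.request.urlopen(req) would send it once the proxy is running.
```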
-
Apparently, there is also the Pipelines project: https://github.com/open-webui/pipelines/tree/main/examples/pipelines/providers . The thread at https://github.com/open-webui/open-webui/issues/3288 has a tip for integrating with Anthropic using "functions".
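For the Pipelines route, the provider examples in that repo follow roughly this skeleton (an untested sketch; the echo body stands in for a real Anthropic API call):

```python
# Sketch of the skeleton used by the open-webui/pipelines provider examples.
from typing import Generator, Iterator, List, Union


class Pipeline:
    def __init__(self):
        # Name shown in the OpenWebUI model picker.
        self.name = "Echo Provider (demo)"

    async def on_startup(self):
        # Called when the pipelines server starts; set up API clients here.
        pass

    async def on_shutdown(self):
        # Called on shutdown; release resources here.
        pass

    def pipe(
        self,
        user_message: str,
        model_id: str,
        messages: List[dict],
        body: dict,
    ) -> Union[str, Generator, Iterator]:
        # A real provider pipeline would forward `messages` to Anthropic
        # (or another backend) here; this stub just echoes the prompt.
        return f"[{model_id}] {user_message}"


p = Pipeline()
print(p.pipe("hello", "demo-model", [], {}))  # prints "[demo-model] hello"
```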
-
LiteLLM is getting fairly mainstream imo (although things are moving very fast) and does more than just provide access to other models (load balancing, logging, spend checks…), so having a LiteLLM process running alongside OpenWebUI would be useful.
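For reference, that extra functionality lives in LiteLLM's proxy config. A hedged sketch (model names and env-var keys are placeholders): giving two deployments the same model_name makes the proxy load-balance between them:

```yaml
# litellm config.yaml sketch: two backends behind one alias.
model_list:
  - model_name: claude          # alias that OpenWebUI will see
    litellm_params:
      model: openrouter/anthropic/claude-3.5-sonnet
      api_key: os.environ/OPENROUTER_API_KEY
  - model_name: claude          # same alias -> proxy load-balances
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```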
In the meantime, I've followed this specific comment (https://github.com/open-webui/open-webui/issues/3288#issuecomment-2219524566) from the thread and was able to get the anthropic/claude models just by configuring OpenWebUI, so that's already a bit of a win.
-