LiteLLM removed from OpenWebUI, requires own separate container
-
It seems LiteLLM has been removed from OpenWebUI as a built-in internal tool (GitHub thread).
Now, the advice is to run LiteLLM separately in its own container.
Does anybody know the implications of this, please?
Can we still use OpenAI-compatible APIs?
What about the OpenRouter API?
A how-to on migrating LiteLLM to its own external container has been posted in the documentation.
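For reference, I'd guess the separate-container setup looks roughly like this. This is only a sketch, not taken from the official how-to; image tags, ports and env var names may differ from what the docs currently recommend:

```yaml
# docker-compose.yml - rough sketch of LiteLLM running beside OpenWebUI
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest     # LiteLLM proxy image
    command: ["--config", "/app/config.yaml", "--port", "4000"]
    volumes:
      - ./litellm-config.yaml:/app/config.yaml     # model list lives in this file
    environment:
      - OPENROUTER_API_KEY=${OPENROUTER_API_KEY}   # referenced from the config
    ports:
      - "4000:4000"

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # point OpenWebUI's OpenAI-compatible connection at the LiteLLM proxy
      - OPENAI_API_BASE_URL=http://litellm:4000/v1
      - OPENAI_API_KEY=sk-anything                 # placeholder / LiteLLM master key
    ports:
      - "3000:8080"
    depends_on:
      - litellm
```

Since OpenWebUI would only see an OpenAI-compatible /v1 endpoint, OpenAI-compatible and OpenRouter models configured in LiteLLM should keep working as before. A quick sanity check against the proxy (port, key and model name are placeholders from the sketch above):

```python
# quick check that the LiteLLM proxy answers OpenAI-compatible requests
from openai import OpenAI  # pip install openai

client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-anything")
resp = client.chat.completions.create(
    model="claude-sonnet",  # whatever model_name is defined in the LiteLLM config
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```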
-
Apparently, there is also the Pipelines project - https://github.com/open-webui/pipelines/tree/main/examples/pipelines/providers - and https://github.com/open-webui/open-webui/issues/3288 has a tip on integrating with Anthropic using "functions".
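For what it's worth, as far as I can tell both the provider pipelines and the "functions" tip come down to the same small shim: accept OpenAI-style chat messages from OpenWebUI and forward them to Anthropic's Messages API. This is not the actual code from either link, just a rough sketch of that translation step (model name is a placeholder, error handling omitted):

```python
# Rough sketch of the OpenAI-style -> Anthropic Messages translation that such
# a provider shim performs; not copied from the linked repos.
import os
import anthropic  # pip install anthropic

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

def chat(messages: list[dict], model: str = "claude-3-5-sonnet-20240620") -> str:
    """`messages` uses the OpenAI format: [{"role": "user", "content": "..."}]."""
    # Anthropic expects the system prompt separately from the conversation turns
    system = "\n".join(m["content"] for m in messages if m["role"] == "system")
    turns = [m for m in messages if m["role"] != "system"]
    kwargs = dict(model=model, max_tokens=1024, messages=turns)
    if system:
        kwargs["system"] = system
    resp = client.messages.create(**kwargs)
    return resp.content[0].text

if __name__ == "__main__":
    print(chat([{"role": "user", "content": "Hello, Claude"}]))
```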
-
LiteLLM is getting fairly mainstream imo (although things are moving very fast) and does more than just provide access to other models (load balancing, logging, spend checks…), so having a LiteLLM process running alongside OpenWebUI would be useful.
In the meantime, I've followed this specific comment (https://github.com/open-webui/open-webui/issues/3288#issuecomment-2219524566) from the thread and was able to get the anthropic/claude models working with just OpenWebUI configuration, so that's already a bit of a win.
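If you do go the separate LiteLLM proxy route, the config you mount into its container looks something like this - syntax from memory, so double-check the LiteLLM proxy docs; model names and keys are placeholders:

```yaml
# litellm-config.yaml - illustrative sketch of a LiteLLM proxy config
model_list:
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-sonnet            # same alias -> LiteLLM load-balances across both
    litellm_params:
      model: openrouter/anthropic/claude-3.5-sonnet
      api_key: os.environ/OPENROUTER_API_KEY

litellm_settings:
  success_callback: ["langfuse"]         # optional logging/observability hook

general_settings:
  master_key: sk-anything                # the key clients such as OpenWebUI send
```

OpenWebUI (or anything else OpenAI-compatible) then just points its OpenAI connection at the proxy's /v1 endpoint with that master key and picks the model_name aliases defined above.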