LocalAI on Cloudron: OpenAI-compatible API to run LLMs (Large Language Models) locally on consumer-grade hardware
-
"Self-hosted, community-driven, local OpenAI-compatible API. Drop-in replacement for OpenAI running LLMs on consumer-grade hardware. LocalAI is a RESTful API to run ggml compatible models: llama.cpp, alpaca.cpp, gpt4all.cpp, rwkv.cpp, whisper.cpp, vicuna, koala, gpt4all-j, cerebras and many others"
MIT Licence
There is a Docker image: https://github.com/go-skynet/LocalAI
https://nitter.net/LocalAI_API
Example setup with chatbot-ui: https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui
Example setup with GPT4All: https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui
-
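To illustrate what "drop-in replacement for OpenAI" means in practice, here is a minimal sketch of a chat request against a LocalAI instance. The base URL (`http://localhost:8080`) and model name (`ggml-gpt4all-j`) are assumptions for illustration; adjust them to your own install. The sketch only builds the OpenAI-style payload, with the actual network call left commented out so it can be run without a server.

```python
import json
import urllib.request

# Assumed local endpoint and model name; change to match your LocalAI install.
BASE_URL = "http://localhost:8080"
MODEL = "ggml-gpt4all-j"

def build_chat_request(model, messages, temperature=0.7):
    """Build an OpenAI-style /v1/chat/completions request for a local server."""
    url = f"{BASE_URL}/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": messages,
        "temperature": temperature,
    }).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    return url, headers, body

url, headers, body = build_chat_request(
    MODEL, [{"role": "user", "content": "Say hello"}]
)
print(url)

# To actually send it (requires a running LocalAI instance):
# req = urllib.request.Request(url, data=body, headers=headers)
# reply = json.loads(urllib.request.urlopen(req).read())
# print(reply["choices"][0]["message"]["content"])
```

Because the request shape matches OpenAI's, existing OpenAI client libraries and front-ends (like the Chatbot UI example above) can usually be pointed at LocalAI just by changing the base URL.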
I much prefer Chatbot UI to OpenAI's own interface.
-
LocalAI is now on v2.27.0, and there have been updates to their All-in-One images.