LocalAI on Cloudron: OpenAI-compatible API to run LLMs (Large Language Models) locally on consumer-grade hardware
"Self-hosted, community-driven, local OpenAI-compatible API. Drop-in replacement for OpenAI running LLMs on consumer-grade hardware. LocalAI is a RESTful API to run ggml compatible models: llama.cpp, alpaca.cpp, gpt4all.cpp, rwkv.cpp, whisper.cpp, vicuna, koala, gpt4all-j, cerebras and many others"
MIT Licence
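Because the API is a drop-in replacement for OpenAI's, existing clients only need their base URL pointed at the LocalAI instance. A minimal sketch of what that looks like, assuming LocalAI is already running at http://localhost:8080 with a ggml model named "ggml-gpt4all-j" in its models directory (the host, port, and model name here are assumptions, not part of the project docs):

```python
# Minimal sketch: call LocalAI's OpenAI-compatible chat endpoint.
# Assumes LocalAI is reachable at http://localhost:8080 and a model
# named "ggml-gpt4all-j" has been placed in its models directory
# (host/port and model name are assumptions for illustration).
import requests

response = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "Say hello from LocalAI"}],
        "temperature": 0.7,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

The same request works against OpenAI's hosted API, which is what makes it attractive to pair with front-ends like Chatbot UI on Cloudron.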
There is a Docker image: https://github.com/go-skynet/LocalAI
https://nitter.net/LocalAI_API

Example setup with chatbot-ui: https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui

Example setup with GPT4All: https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui
I much prefer Chatbot UI to OpenAI's own interface.