AnythingLLM - AI business intelligence tool
-
- Main Site: https://useanything.com/
- Git: https://github.com/Mintplex-Labs/anything-llm
- Licence: MIT
- Docker - Yes
- Demo:



-
Yes please!!
-
Check out the developer here. He seems like a good guy, and he gives a solid technical presentation.
Let's support AnythingLLM on Cloudron! It is wonderful!
-
I am using AnythingLLM on the desktop and it blows Open WebUI away for RAG and Agents!
If we can have AnythingLLM on Cloudron, it would be a gamechanger for groups to collaborate in a workspace with shared documents.
-
@jagan said in AnythingLLM - AI business intelligence tool:

I am using AnythingLLM on the desktop and it blows Open WebUI away for RAG and Agents!
If we can have AnythingLLM on Cloudron, it would be a gamechanger for groups to collaborate in a workspace with shared documents.

Maybe you could explain to us how AnythingLLM "blows" OW away, and whether that is still the case today? OW has developed a great deal since then, so a comparison would be good to see.
To start with, this is a desktop app and, yes, there is a "cloud" version; however, the only cloud version available is run from Docker (which should make it easier for Cloudron to start with).
Also, given how capable the desktop app is, how relevant would a "cloud version" on Cloudron actually be?
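For anyone who wants to try the server version before a Cloudron package exists, the Docker image can be run directly. A minimal sketch based on the project's own Docker instructions (the `mintplexlabs/anythingllm` image and port 3001 come from their docs; the host storage path is an assumption you should adjust):

```shell
# Persist workspaces, documents, and vector data across container restarts
export STORAGE_LOCATION="$HOME/anythingllm"   # assumed host path; change as needed
mkdir -p "$STORAGE_LOCATION" && touch "$STORAGE_LOCATION/.env"

# Run the server image; the web UI is then at http://localhost:3001
docker run -d -p 3001:3001 \
  --cap-add SYS_ADMIN \
  -v "$STORAGE_LOCATION:/app/server/storage" \
  -v "$STORAGE_LOCATION/.env:/app/server/.env" \
  -e STORAGE_DIR="/app/server/storage" \
  mintplexlabs/anythingllm
```

A Cloudron package would essentially wrap this container, with Cloudron supplying the persistent storage volume and reverse proxy.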
-
AnythingLLM has had enormous improvements since it was first requested by @TheMoodBoardz
In particular, support for agents and RAG has advanced to keep up with the huge demand in this area. Installing AnythingLLM on GNU+Linux now automatically installs Ollama, too.
Here is a summary of the main advances. If you want to make RAG and AI agents easier, AnythingLLM would be a great application to support.
Comparing v1.0.0 and v1.9.0 of AnythingLLM, here are the most significant changes, categorized into key areas:
Agent System Overhaul
- Complete redesign of the agent experience with real-time streaming tool calls
- Agents can now download and ingest web files (PDF, Excel, CSV) during conversations
- All providers and models support agentic streaming capabilities
Local LLM Integration
- Added Microsoft Foundry Local integration for Windows/macOS (free local LLM solution)
- Linux now includes Ollama (0.11.4) built-in for immediate local LLM support
- ARM64 support for both Linux and Windows platforms
Platform & Infrastructure
- Expanded Linux support with automatic apparmor rules and .desktop file creation
- NVIDIA NIM being phased out in favor of newer solutions
- Upgraded core Electron version for improved stability
UI/UX Improvements
- Model swap capability during chats (Ctrl/Cmd+Shift+L)
- Workspace and thread searching in sidebar
- System prompt version tracking and restoration
- Enhanced mobile support with Android beta app
Developer Features
- MCP (Model Context Protocol) compatibility
- No-code AI agent builder
- Enhanced API endpoints and developer tools
- Multi-modal support for both open and closed-source LLMs
Internationalization
- Added Portuguese and Estonian translations
- Improved multi-language OCR support
- Expanded localization coverage
The update represents a massive leap from the initial version, transforming AnythingLLM from a basic document chat system into a comprehensive AI agent platform with robust local operation and extensive customization options.