AnythingLLM has seen enormous improvements since it was first requested by @TheMoodBoardz
In particular, support for agents and RAG has been advanced to keep pace with the huge demand in this area.
Installing AnythingLLM on GNU+Linux now automatically installs Ollama, too.
Here is a summary of the main advances. If you want to make RAG and AI agents easier to use, AnythingLLM is a great application to support.
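For readers new to the term, RAG (retrieval-augmented generation) means fetching the documents most relevant to a query and feeding them to the model as context. The sketch below shows only the retrieval idea using toy bag-of-words vectors; AnythingLLM's actual pipeline uses real embeddings and a vector database.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Ollama runs large language models locally.",
    "AnythingLLM ingests documents into a vector database.",
    "Agents can call tools during a conversation.",
]
print(retrieve("vector database documents", docs))
```

In a real system, the retrieved passages would then be prepended to the prompt before generation; that second half is what the LLM provider handles.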
Comparing v1.0.0 and v1.9.0 of AnythingLLM, here are the most significant changes, categorized into key areas:
Agent System Overhaul
Complete redesign of the agent experience with real-time streaming tool calls
Agents can now download and ingest web files (PDF, Excel, CSV) during conversations
All providers and models support agentic streaming capabilities
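To illustrate what "real-time streaming tool calls" means in practice, here is a minimal sketch of an agent turn emitted as a stream of events, so a UI can show tool progress before the final answer arrives. The event names and shapes are hypothetical; AnythingLLM's real wire format differs.

```python
from typing import Iterator

def run_agent_turn(prompt: str) -> Iterator[dict]:
    """Simulate an agent turn that streams a tool call and its result
    before the final answer tokens (hypothetical event schema)."""
    yield {"type": "tool_call_start", "tool": "web_scrape",
           "args": {"url": "https://example.com"}}
    yield {"type": "tool_call_result", "tool": "web_scrape", "ok": True}
    yield {"type": "token", "text": "Here is what I found..."}
    yield {"type": "done"}

events = list(run_agent_turn("summarize example.com"))
print([e["type"] for e in events])
```

The point of streaming the tool-call events, rather than returning everything at once, is that the user sees which tool is running while it runs.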
Local LLM Integration
Added Microsoft Foundry Local integration for Windows/MacOS (free local LLM solution)
Linux now includes Ollama (0.11.4) built-in for immediate local LLM support
ARM64 support for both Linux and Windows platforms
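Because Linux builds now bundle Ollama, any local script can talk to its HTTP API, which by default listens on port 11434. The sketch below only builds (and does not send) a non-streaming `/api/generate` request; actually sending it assumes a running Ollama server with the named model already pulled.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

req = build_request("llama3", "Why is the sky blue?")
print(json.loads(req.data))
```

To actually run it, pass `req` to `urllib.request.urlopen` with a server up; the response body is JSON with the completion in its `response` field.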
Platform & Infrastructure
Expanded Linux support with automatic apparmor rules and .desktop file creation
NVIDIA NIM support is being phased out in favor of newer solutions
Upgraded core Electron version for improved stability
UI/UX Improvements
Model swap capability during chats (Ctrl/Cmd+Shift+L)
Workspace and thread searching in sidebar
System prompt version tracking and restoration
Enhanced mobile support with Android beta app
Developer Features
MCP (Model Context Protocol) compatibility
No-code AI agent builder
Enhanced API endpoints and developer tools
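MCP is built on JSON-RPC 2.0: a client invokes a server-side tool with a `tools/call` request. The sketch below shows the shape of that message; the tool name and argument are hypothetical, and a real client also performs an initialization handshake and transport framing before any tool call.

```python
import itertools
import json

_ids = itertools.count(1)  # JSON-RPC requests need unique ids

def mcp_tool_call(name: str, arguments: dict) -> str:
    """Encode an MCP 'tools/call' request as a JSON-RPC 2.0 message."""
    msg = {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }
    return json.dumps(msg)

# Hypothetical tool name and argument, for illustration only.
print(mcp_tool_call("search_docs", {"query": "vector databases"}))
```

Because the protocol is this uniform, an MCP-compatible host like AnythingLLM can expose any conforming server's tools to its agents without per-tool integration code.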
Multi-modal support for both open and closed-source LLMs
Internationalization
Added Portuguese and Estonian translations
Improved multi-language OCR support
Expanded localization coverage
The update represents a massive leap from the initial version, transforming from a basic document chat system to a comprehensive AI agent platform with robust local operation capabilities and extensive customization options.