RasaGPT - Headless LLM chatbot platform built on Rasa and Langchain, served via ngrok with a Telegram bot front end
Overview
RasaGPT is the first headless LLM chatbot platform built on top of Rasa and Langchain. It is boilerplate and a reference implementation of Rasa and Telegram utilizing an LLM library like Langchain for indexing, retrieval and context injection.
- Resources: https://rasagpt.dev
- 🧑 Github: https://github.com/paulpierre/RasaGPT
- 🧙 Author: @paulpierre
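To make "indexing, retrieval and context injection" concrete, here is a toy, self-contained sketch of that loop. The hashing `embed()` is a throwaway stand-in for a real embedding model, and none of the names below come from the RasaGPT codebase; the actual project stores vectors in pgvector and calls out to an LLM provider.

```python
# Toy sketch of the index -> retrieve -> inject-context loop RasaGPT is built around.
# The hash-based embed() is a stand-in for a real embedding model; everything here is
# illustrative, not project code.
import hashlib
import numpy as np

DIM = 256

def embed(text: str) -> np.ndarray:
    """Toy bag-of-words hashing embedding; swap in a real embedding model."""
    vec = np.zeros(DIM)
    for token in text.lower().split():
        vec[int(hashlib.md5(token.encode()).hexdigest(), 16) % DIM] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# 1. Indexing: embed each document chunk once and keep the vectors around.
docs = [
    "Refunds are processed within 5 business days.",
    "Support is available Monday through Friday, 9am-5pm CET.",
]
index = [(d, embed(d)) for d in docs]

# 2. Retrieval: rank chunks by cosine similarity to the user question.
def retrieve(question: str, k: int = 2) -> list[str]:
    q = embed(question)
    scored = sorted(index, key=lambda pair: float(q @ pair[1]), reverse=True)
    return [doc for doc, _ in scored[:k]]

# 3. Context injection: prepend the retrieved chunks to the prompt sent to the LLM.
question = "How long do refunds take?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt would then be passed to the LLM of your choice
```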
What is Rasa?
In their own words:
Rasa is an open source (Python) machine learning framework to automate text- and voice-based conversations: NLU, dialogue management, connect to Slack, Facebook, and more - Create chatbots and voice assistants
In my words:
Rasa is a very popular (dare I say de facto?) and easy-enough-to-use chatbot framework with built-in NLU ML pipelines that are largely obsolete in a world of LLMs, but it remains a conceptual starting point for a reimagined chatbot framework.
Why RasaGPT?
RasaGPT works out of the box. A lot of the implementation headaches have already been sorted out so you don’t have to deal with them, including:
- Creating your own proprietary bot endpoint using FastAPI, with document upload and a “training” pipeline included
- How to integrate Langchain/LlamaIndex and Rasa (a custom-action sketch follows this list)
- Library conflicts with LLM libraries and passing metadata
- Dockerized support for running Rasa on macOS
- Reverse proxy with chatbots via ngrok
- Implementing pgvector with your own custom schema instead of using Langchain’s highly opinionated PGVector class (a schema sketch also follows this list)
- Adding multi-tenancy (Rasa doesn’t natively support this), sessions, and metadata passing between Rasa and your own backend / application
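On the Rasa side, the usual way to hand a message off to an external backend is a custom action running in the Rasa action server. The sketch below assumes a hypothetical `/chat` FastAPI route, payload, and response shape; RasaGPT’s actual API contract may differ.

```python
# actions.py - sketch of a Rasa custom action that forwards the user's message to an
# external LLM endpoint. The URL, payload shape, and response key are assumptions for
# illustration, not RasaGPT's actual contract.
from typing import Any, Dict, List, Text

import requests
from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher

LLM_ENDPOINT = "http://api:8888/chat"  # hypothetical FastAPI route


class ActionAskLLM(Action):
    def name(self) -> Text:
        return "action_ask_llm"

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        payload = {
            "session_id": tracker.sender_id,               # lets the backend track sessions
            "message": tracker.latest_message.get("text"),  # raw user utterance
        }
        try:
            resp = requests.post(LLM_ENDPOINT, json=payload, timeout=30)
            resp.raise_for_status()
            answer = resp.json().get("answer", "Sorry, I came up empty.")
        except requests.RequestException:
            answer = "The LLM backend is unreachable right now."
        dispatcher.utter_message(text=answer)
        return []
```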
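And for the pgvector point above, here is a minimal SQLAlchemy sketch of a custom schema with a vector column, assuming a local Postgres with the pgvector extension and 1536-dimension embeddings; the table and column names are illustrative, not RasaGPT’s actual schema.

```python
# Sketch of a custom pgvector-backed schema with SQLAlchemy instead of Langchain's
# PGVector class. Names and the 1536-dim embedding size are assumptions; the real
# RasaGPT schema also carries org/project metadata for multi-tenancy.
from pgvector.sqlalchemy import Vector
from sqlalchemy import Column, Integer, String, Text, create_engine, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class DocumentChunk(Base):
    __tablename__ = "document_chunk"

    id = Column(Integer, primary_key=True)
    organization = Column(String, index=True)  # tenant metadata lives next to the vector
    content = Column(Text, nullable=False)
    embedding = Column(Vector(1536))           # requires the pgvector extension


engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost/rasagpt")
Base.metadata.create_all(engine)


def nearest_chunks(query_embedding: list[float], org: str, k: int = 3):
    """Return the k chunks closest to the query vector for one tenant."""
    with Session(engine) as session:
        stmt = (
            select(DocumentChunk)
            .where(DocumentChunk.organization == org)
            .order_by(DocumentChunk.embedding.cosine_distance(query_embedding))
            .limit(k)
        )
        return session.scalars(stmt).all()
```

Keeping the schema in your own ORM models (rather than Langchain’s PGVector class) is what lets you attach tenant and session metadata directly to each chunk.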
The backstory is familiar: a friend came to me with a problem. I scoured Google and GitHub for a decent reference implementation of LLMs integrated with Rasa but came up empty-handed. I figured this was a great opportunity to satiate my curiosity; two days later I had a proof of concept, and a week later this is what I came up with.