Chat-with-Github-Repo
- https://github.com/peterw/Chat-with-Github-Repo
- https://twitter.com/pwang_szn/status/1650801868568772608
- https://twitter.com/i/status/1650801915918434304
- Clones a Git repository
- Loads documents from the repository
- Splits the documents into smaller text chunks
- Adds those chunks, with embeddings computed via OpenAI Embeddings, to a DeepLake instance
- A main function ties all of these steps together.
- Now that we've embedded all the data, we can access it from
@activeloopai
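The splitting step above can be sketched in plain Python. This is a character-based simplification (the repo likely uses LangChain's text splitters instead); `chunk_size` and `overlap` are illustrative parameters, not the project's actual settings:

```python
def split_into_chunks(text, chunk_size=1000, overlap=100):
    """Split text into overlapping character chunks.

    Overlap keeps a little shared context between adjacent chunks,
    so a sentence cut at a boundary still appears (partly) in both.
    """
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # last chunk reached the end of the text
        start += chunk_size - overlap
    return chunks


# Example: 2500 characters -> chunks starting at 0, 900, 1800
doc = "".join(str(i % 10) for i in range(2500))
chunks = split_into_chunks(doc)
```

Each chunk is then embedded individually before being written to the vector store.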
One nice thing about DeepLake is that you can also store metadata with your embeddings if you need to.
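A minimal sketch of keeping metadata next to each embedding, using plain Python records rather than DeepLake's actual API (the `Record` class and `add_record` helper here are hypothetical, just to show the shape of the data):

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    text: str                 # the raw chunk of text
    embedding: list           # vector produced by an embedding model
    metadata: dict = field(default_factory=dict)  # e.g. source file path

def add_record(store, text, embedding, **metadata):
    """Append a chunk plus its embedding and arbitrary metadata."""
    store.append(Record(text=text, embedding=embedding, metadata=metadata))
    return store


store = []
add_record(store, "def greet(): ...", [0.1, 0.2, 0.3], source="src/app.py")
```

Storing the source file path as metadata lets the chatbot cite which file an answer came from.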
At a high level, here's what's going on:
The DeepLake instance loads the embeddings we computed for the dataset.
The chatbot searches the dataset for relevant chunks and generates responses with GPT-3.5-turbo based on the user's input.
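The search-then-generate loop can be sketched with a toy cosine-similarity retriever. The `retrieve` and `build_prompt` helpers are hypothetical; in the real project the prompt would be sent to GPT-3.5-turbo, which is omitted here:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, records, k=2):
    """Return the texts of the k chunks most similar to the query embedding."""
    ranked = sorted(records,
                    key=lambda r: cosine(query_vec, r["embedding"]),
                    reverse=True)
    return [r["text"] for r in ranked[:k]]

def build_prompt(question, context_chunks):
    """Assemble the context and question into a single prompt string."""
    context = "\n---\n".join(context_chunks)
    return f"Answer using only this repository context:\n{context}\n\nQuestion: {question}"


records = [
    {"text": "README intro", "embedding": [1.0, 0.0]},
    {"text": "license text", "embedding": [0.0, 1.0]},
    {"text": "setup guide",  "embedding": [0.9, 0.1]},
]
top = retrieve([1.0, 0.0], records, k=2)
prompt = build_prompt("How do I set this up?", top)
```

In practice the query embedding comes from the same OpenAI embedding model used at indexing time, so query and chunks live in the same vector space.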