Chat with your GitHub repositories, built with LangChain, Supabase, Next.js, and the GitHub API. Uses `gpt-3.5-turbo-0125` by default.
- Log in with GitHub to get a provider token
- Use the provider token to fetch metadata for all public repos (see the Octokit sketch after this list)
- Load the repo into Supabase
- Split the code into chunks, embed the file contents, and store them with their metadata in Supabase (pgvector); see the embedding sketch after this list
- Begin a chat session with the repo
- Load message history from an existing chat session or create a new one
- User asks a question
- Create a vector store from the existing index (the repo was embedded beforehand; see the retrieval sketch after this list)
- Build a runnable sequence with LangChain (see the chat-turn sketch after this list)
- Load the chat history
- Clean up the original prompt through OpenAI (i.e. rewrite the question so it is clearer)
- Load context by embedding the prompt and running a cosine similarity search
- Run the full chain and return a stream
- A callback after the chain ends stores the AI response in the chat history DB table
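A minimal sketch of the repo-metadata step, assuming a Supabase client created from the project's URL/anon key and the `octokit` package. The function name and env var names are illustrative, not the repo's actual code; note that the provider token is only present in the session right after the GitHub OAuth login.

```ts
// Sketch only: list the signed-in user's public repos with the GitHub provider token.
import { createClient } from '@supabase/supabase-js'
import { Octokit } from 'octokit'

// Env var names are assumptions for this sketch.
const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)

export async function listPublicRepos() {
  // After `signInWithOAuth({ provider: 'github' })`, the session carries the provider token.
  const { data, error } = await supabase.auth.getSession()
  const token = data.session?.provider_token
  if (error || !token) throw new Error('No GitHub provider token; sign in with GitHub first.')

  const octokit = new Octokit({ auth: token })

  // Paginate through the user's public repos and keep only the metadata we care about.
  const repos = await octokit.paginate('GET /user/repos', {
    visibility: 'public',
    per_page: 100,
  })

  return repos.map(r => ({
    fullName: r.full_name,
    defaultBranch: r.default_branch,
    language: r.language,
  }))
}
```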
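A sketch of the chunk-and-embed step, assuming the standard LangChain + Supabase pgvector setup (a `documents` table and a `match_documents` SQL function). Import paths vary across LangChain versions, and the chunk sizes and metadata fields are illustrative.

```ts
// Sketch: chunk repo files, embed them, and store vectors + metadata in Supabase pgvector.
import { createClient } from '@supabase/supabase-js'
import { Document } from '@langchain/core/documents'
import { RecursiveCharacterTextSplitter } from '@langchain/textsplitters'
import { OpenAIEmbeddings } from '@langchain/openai'
import { SupabaseVectorStore } from '@langchain/community/vectorstores/supabase'

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_KEY!)

export async function embedRepoFiles(
  repo: string,
  files: { path: string; content: string }[]
) {
  // Code-aware splitter; chunk sizes are illustrative.
  const splitter = RecursiveCharacterTextSplitter.fromLanguage('js', {
    chunkSize: 1500,
    chunkOverlap: 150,
  })

  const docs: Document[] = []
  for (const file of files) {
    const chunks = await splitter.createDocuments(
      [file.content],
      [{ repo, path: file.path }] // metadata stored next to each chunk
    )
    docs.push(...chunks)
  }

  // Embed every chunk and write it to the pgvector-backed table.
  await SupabaseVectorStore.fromDocuments(docs, new OpenAIEmbeddings(), {
    client: supabase,
    tableName: 'documents',
    queryName: 'match_documents',
  })
}
```

The `match_documents` function created during the Supabase setup is what later powers the cosine similarity search at chat time.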
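A sketch of reconnecting to the already-populated index at chat time, assuming the same table and query function as the embedding step; the metadata filter and `k` value are illustrative.

```ts
// Sketch: reconnect to the existing pgvector index and expose it as a retriever.
import { createClient } from '@supabase/supabase-js'
import { OpenAIEmbeddings } from '@langchain/openai'
import { SupabaseVectorStore } from '@langchain/community/vectorstores/supabase'

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_KEY!)

export async function getRepoRetriever(repo: string) {
  const vectorStore = await SupabaseVectorStore.fromExistingIndex(new OpenAIEmbeddings(), {
    client: supabase,
    tableName: 'documents',
    queryName: 'match_documents',
  })

  // The query is embedded and matched by cosine similarity inside Postgres;
  // the metadata filter keeps results scoped to the selected repo.
  return vectorStore.asRetriever({ k: 6, filter: { repo } })
}
```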
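A sketch of a single chat turn, roughly matching the steps above: rephrase the prompt, retrieve context, stream the answer, and persist it in a callback. The `retriever` parameter, the `chat_messages` table, and the prompt wording are assumptions, and the exact output shape seen by the callback can differ between LangChain versions.

```ts
// Sketch: rephrase the question, build a retrieval-augmented chain, stream the answer,
// and store the final response when the top-level chain run ends.
import { createClient } from '@supabase/supabase-js'
import { ChatOpenAI } from '@langchain/openai'
import { ChatPromptTemplate } from '@langchain/core/prompts'
import { StringOutputParser } from '@langchain/core/output_parsers'
import { RunnableSequence, RunnablePassthrough } from '@langchain/core/runnables'
import type { VectorStoreRetriever } from '@langchain/core/vectorstores'

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_KEY!)
const llm = new ChatOpenAI({ model: 'gpt-3.5-turbo-0125', temperature: 0 })

export async function askRepo(params: {
  retriever: VectorStoreRetriever
  question: string
  history: string
  sessionId: string
}) {
  const { retriever, question, history, sessionId } = params

  // Step 1: ask the model for a cleaner, standalone version of the question.
  const cleaned = await ChatPromptTemplate.fromTemplate(
    'Chat history:\n{history}\n\nRewrite this question so it is clear and standalone: {question}'
  )
    .pipe(llm)
    .pipe(new StringOutputParser())
    .invoke({ question, history })

  // Step 2: retrieval-augmented answer chain; context comes from the similarity search.
  const answerPrompt = ChatPromptTemplate.fromTemplate(
    'Answer using only this code context:\n{context}\n\nQuestion: {question}'
  )

  const chain = RunnableSequence.from<string, string>([
    {
      context: async (q: string) => {
        const docs = await retriever.invoke(q)
        return docs.map(d => d.pageContent).join('\n---\n')
      },
      question: new RunnablePassthrough<string>(),
    },
    answerPrompt,
    llm,
    new StringOutputParser(),
  ])

  // Step 3: stream the answer; a callback persists it once the outermost run ends.
  return chain.stream(cleaned, {
    callbacks: [
      {
        handleChainEnd: async (outputs: any, _runId: string, parentRunId?: string) => {
          // Only the top-level run (no parent) holds the final answer; the output
          // shape varies slightly across versions, so check both forms.
          const text = typeof outputs === 'string' ? outputs : outputs?.output
          if (parentRunId === undefined && typeof text === 'string') {
            await supabase
              .from('chat_messages')
              .insert({ session_id: sessionId, role: 'assistant', content: text })
          }
        },
      },
    ],
  })
}
```

In practice the `retriever` here would be the one returned by `getRepoRetriever` in the previous sketch, and the streamed chunks would likely be forwarded to the client with the Vercel AI SDK.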
NOTE: This repo is a proof of concept. Security features such as proper auth, RLS, and safe API key handling are not implemented. Don't deploy to production unless you like chaos.
cvega21/chat-with-your-code is built on the following main stack:
- TypeScript – Languages
- JavaScript – Languages
- SQL – Languages
- Shell – Languages
- React – JavaScript UI Libraries
- Tailwind CSS – Front-End Frameworks
- Next.js – Frameworks (Full Stack)
- Supabase – Realtime Backend / API
- LangChain – Large Language Model Tools
- OpenAI – Large Language Models
- Octokit – GitHub API SDK
- Vercel AI SDK – Large Language Model Tools