
LexPodcast-LLM

Ask an LLM anything about Lex Fridman Podcast videos on YouTube

Setup

1. Create a virtual environment

python -m venv venv
source venv/bin/activate

2. Install dependencies

pip install -r requirements.txt

3. Set up a local instance of Qdrant. The simplest way to do this is with Docker. Run the following commands in the terminal (a quick connectivity check is sketched after them):

docker pull qdrant/qdrant
docker run -p 6333:6333 qdrant/qdrant
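Once the container is running, you can optionally confirm that the local instance is reachable. This is a minimal sketch using the qdrant-client package (assumed to be in requirements.txt), not part of the repository itself:

from qdrant_client import QdrantClient

# Connect to the local Qdrant instance started by the docker run command above.
client = QdrantClient(url="http://localhost:6333")

# Listing collections is a cheap way to confirm the server is up and reachable.
print(client.get_collections())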

4. Set up configuration variables in the .env file

  1. Rename the .env.example file to .env (an illustrative example is sketched after this list).
  2. In the current version of lexllm, the only mandatory environment variables are CHATNBX_KEY and OPENAI_API_KEY.
  3. CHATNBX_KEY is used to invoke the ChatNBX API, which generates the answers.
  4. OPENAI_API_KEY is used to invoke the OpenAI Embeddings API, which converts the query into an embedding that is in turn used to fetch relevant documents from the Qdrant vector DB.
  5. QDRANT_CLOUD_KEY and QDRANT_DB_URL are optional since the embeddings are stored in a local instance of Qdrant. I was running into issues (HTTP 403 errors) while creating a collection in the Qdrant cloud DB.
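The variable names below come from .env.example; the values are placeholders, so substitute your own keys. QDRANT_CLOUD_KEY and QDRANT_DB_URL can be left empty when using the local Qdrant instance:

CHATNBX_KEY=your-chatnbx-key
OPENAI_API_KEY=sk-your-openai-key
QDRANT_CLOUD_KEY=
QDRANT_DB_URL=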

5. Store embeddings in Qdrant DB

  1. Inside embed.py, you can set CREATE_EMBEDDING to True if you want to recreate the embeddings.
  2. Since the embeddings are already provided, I recommend leaving it unchanged (a rough sketch of what this step does is shown after the command below).
python embed.py
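For context, this is roughly the pattern this step follows when CREATE_EMBEDDING is True: create a collection sized for the embedding model and upsert the vectors along with their source text. The snippet below is a minimal sketch, not the actual embed.py; the collection name, embedding size (OpenAI ada-002, 1536 dimensions), and payload fields are assumptions:

from qdrant_client import QdrantClient
from qdrant_client.models import Distance, VectorParams, PointStruct

client = QdrantClient(url="http://localhost:6333")

# Create (or recreate) a collection sized for 1536-dimensional embeddings.
client.recreate_collection(
    collection_name="lex_podcast",  # hypothetical name; embed.py may use a different one
    vectors_config=VectorParams(size=1536, distance=Distance.COSINE),
)

# Upsert one embedding together with its source text as payload.
client.upsert(
    collection_name="lex_podcast",
    points=[PointStruct(id=1, vector=[0.0] * 1536, payload={"text": "transcript chunk"})],
)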

Ask queries

  1. To interact with the chatbot, I have provided a notebook called chat.ipynb.
  2. It also contains some example questions that you can ask the chatbot (a minimal retrieval sketch follows this list).
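At a high level, answering a query means embedding it with the OpenAI Embeddings API, fetching the most similar transcript chunks from Qdrant, and passing those chunks as context to the ChatNBX API for answer generation. The sketch below covers only the retrieval part and assumes the openai>=1.0 SDK, the ada-002 embedding model, and a hypothetical collection name; chat.ipynb may differ in all of these:

import os
from openai import OpenAI
from qdrant_client import QdrantClient

oai = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
qdrant = QdrantClient(url="http://localhost:6333")

query = "What does Lex say about AGI timelines?"

# Embed the query with the same model that was used to build the index.
embedding = oai.embeddings.create(
    model="text-embedding-ada-002",  # assumption; chat.ipynb may use a different model
    input=query,
).data[0].embedding

# Fetch the most similar transcript chunks from the local Qdrant instance.
hits = qdrant.search(
    collection_name="lex_podcast",  # hypothetical collection name
    query_vector=embedding,
    limit=3,
)
for hit in hits:
    print(hit.score, hit.payload)

The retrieved chunks would then be included in the prompt sent to the ChatNBX API, which generates the final answer.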
