LLM Guardrails

FastAPI demo app with a simple guardrails pipeline, Groq chat completions, and Qdrant-backed retrieval.
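The guardrails pipeline itself is not shown in this README. As an illustrative sketch only (the function name, blocked patterns, and policy below are hypothetical, not the repo's actual rules), an input-side guardrail might look like:

```python
import re

# Hypothetical block list: these patterns are illustrative examples of
# prompt-injection phrases, not the repository's real configuration.
BLOCKED_PATTERNS = [
    re.compile(r"ignore (all|previous) instructions", re.IGNORECASE),
    re.compile(r"reveal .*system prompt", re.IGNORECASE),
]

def check_input(user_input: str) -> tuple[bool, str]:
    """Return (allowed, reason); block inputs matching any pattern."""
    for pattern in BLOCKED_PATTERNS:
        if pattern.search(user_input):
            return False, f"blocked by pattern: {pattern.pattern}"
    return True, "ok"
```

A real pipeline would typically run checks like this before the Groq call and again on the model's output before returning it.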

Requirements

  • Python 3.10+
  • Docker and Docker Compose
  • Groq API key

Environment

Create a local .env file from the example:

cp .env.example .env

Then set GROQ_API_KEY in .env.

For Docker Compose, keep:

QDRANT_HOST=qdrant
QDRANT_PORT=6333

For running the app directly on your machine, use:

QDRANT_HOST=localhost
QDRANT_PORT=6333
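The app presumably reads these values from the environment at startup. A minimal sketch of that (the variable names match the `.env` keys above; the defaults and helper are assumptions, not the repo's actual code):

```python
import os

# Read Qdrant connection settings from the environment.
# The localhost/6333 defaults are assumed for local runs.
QDRANT_HOST = os.getenv("QDRANT_HOST", "localhost")
QDRANT_PORT = int(os.getenv("QDRANT_PORT", "6333"))

def qdrant_url() -> str:
    """Build the base URL the Qdrant client would connect to."""
    return f"http://{QDRANT_HOST}:{QDRANT_PORT}"
```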

Run With Docker

docker compose up --build

The API will be available at:

http://localhost:8000

Run Locally

Start Qdrant:

docker compose up qdrant

Install dependencies and run FastAPI:

python -m venv .venv
.venv\Scripts\activate   (Windows; on macOS/Linux use: source .venv/bin/activate)
pip install -r requirements.txt
uvicorn app.main:app --reload

API

Health check:

GET /

Chat:

POST /chat?user_input=What%20are%20AI%20guardrails%3F
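Because user_input is passed as a query parameter, it must be percent-encoded, as in the example above. A small sketch of building the request URL (the base URL and endpoint path come from this README; the helper function is illustrative):

```python
from urllib.parse import quote

BASE_URL = "http://localhost:8000"  # where docker compose exposes the API

def chat_url(user_input: str) -> str:
    """Build the POST /chat URL with a percent-encoded user_input."""
    return f"{BASE_URL}/chat?user_input={quote(user_input)}"

# chat_url("What are AI guardrails?")
# → "http://localhost:8000/chat?user_input=What%20are%20AI%20guardrails%3F"
# Send it (server must be running) with e.g.:
#   urllib.request.urlopen(urllib.request.Request(url, method="POST"))
```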

Git Push Checklist

git init
git add .
git commit -m "Initial commit"
git branch -M main
git remote add origin <your-repository-url>
git push -u origin main

The real .env file is git-ignored, so API keys are never pushed.

About

Input/output validation to secure data and reduce hallucinations.
