A fully local, privacy-first SDLC assistant that helps you manage, visualize, and document all phases of the Software Development Life Cycle using locally hosted Hugging Face models. No OpenAI or external API calls.
- AI SDLC Chat Assistant (local model)
- SDLC Phase Workspace (Requirements, Design, Implementation, Testing, Deployment, Maintenance)
- AI-generated documentation per phase + improvement suggestions
- Visualization dashboard (progress, time allocation, done vs pending)
- Export full report or per-phase to PDF
- Offline mode support (HF_HUB_OFFLINE)
- Backend/Inference: Python + Hugging Face `transformers`
- UI: Streamlit
- Visuals: Plotly
- PDF: reportlab
```
SDLC huggingface/
├─ app.py
├─ utils.py
├─ requirements.txt
├─ README.md
└─ models/   # place offline models here (optional)
```
- Create and activate a virtual environment (recommended).
- Install Python dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Install PyTorch per your environment (CUDA or CPU):

  ```bash
  # Example (CUDA 12.1):
  pip install torch --index-url https://download.pytorch.org/whl/cu121
  # Or CPU only:
  pip install torch --index-url https://download.pytorch.org/whl/cpu
  ```
- (Optional) Prepare models for fully offline usage:
  - Download a model with `huggingface-cli` or `git lfs` on a machine with internet access and copy it into `models/`.
  - Suggested local folders:
    - `models/mistral/` containing `mistralai/Mistral-7B-Instruct-v0.2`
    - `models/phi3/` containing `microsoft/Phi-3-mini-4k-instruct`
  - Or rely on your local HF cache if pre-populated.
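The download step can also be scripted with `huggingface_hub` (which ships alongside `transformers`). A sketch, assuming that library is installed; the helper names and the flat `models/<model-name>` layout are illustrative, not the app's required layout:

```python
import os

def target_dir(repo_id: str, root: str = "models") -> str:
    # e.g. "mistralai/Mistral-7B-Instruct-v0.2" -> "models/Mistral-7B-Instruct-v0.2"
    return os.path.join(root, repo_id.split("/")[-1])

def fetch_model(repo_id: str, root: str = "models") -> str:
    """Download a full model snapshot for later offline use (needs internet once)."""
    from huggingface_hub import snapshot_download  # lazy import: only needed while online
    dest = target_dir(repo_id, root)
    snapshot_download(repo_id=repo_id, local_dir=dest)
    return dest
```

Run `fetch_model("microsoft/Phi-3-mini-4k-instruct")` once on a connected machine, then copy the resulting folder to the offline host.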
- Toggle "Offline mode (HF_HUB_OFFLINE)" in the sidebar to enforce offline inference.
- Ensure the model is available locally (either in `models/` or your HF cache). When offline mode is enabled, no network calls are attempted.
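Enforcing offline mode programmatically boils down to setting the relevant environment variables before any model is loaded. A minimal sketch; the function names are illustrative:

```python
import os

def enable_offline_mode() -> None:
    """Tell huggingface_hub/transformers to never attempt network access."""
    os.environ["HF_HUB_OFFLINE"] = "1"
    os.environ["TRANSFORMERS_OFFLINE"] = "1"  # transformers honors this variable as well

def model_available_locally(path: str) -> bool:
    # A downloaded transformers model folder contains a config.json
    return os.path.isfile(os.path.join(path, "config.json"))
```

These variables must be set before the first `from_pretrained(...)` call in the process, which is why the app toggles them from the sidebar before loading the model.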
```bash
streamlit run app.py
```

Then open the provided local URL in your browser.
- From the sidebar, click "Load/Reload Model".
- Preferred Model options:
  - Auto (picks Mistral if RAM ≥ 16 GB, else Phi-3 Mini)
  - `models/local-mistral` (alias for `models/mistral` if present)
  - `models/local-phi3` (alias for `models/phi3` if present)
  - Direct HF IDs (if cached locally)
- If `bitsandbytes` is available, 4-bit quantization will be used when possible.
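The "Auto" choice above amounts to a RAM threshold. A sketch of that selection; the function name and taking total system RAM in GB as an argument are assumptions, not the app's exact signature:

```python
def auto_pick_model(total_ram_gb: float) -> str:
    """Pick Mistral when there is headroom for a 7B model, else the smaller Phi-3 Mini."""
    if total_ram_gb >= 16:
        return "mistralai/Mistral-7B-Instruct-v0.2"
    return "microsoft/Phi-3-mini-4k-instruct"
```

On Linux/macOS the input can come from `os.sysconf("SC_PHYS_PAGES") * os.sysconf("SC_PAGE_SIZE")`; cross-platform apps typically use `psutil.virtual_memory().total` instead.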
- Use the sidebar to export a full SDLC report.
- Or export per-phase from the phase tab.
- PDFs are written to the `outputs/` folder.
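A per-phase export with `reportlab` can be sketched as follows. This is illustrative, not the app's actual export code: the function name, file naming, and page layout are assumptions, and the helper degrades gracefully when `reportlab` is not installed:

```python
import os

def export_phase_pdf(phase: str, text: str, out_dir: str = "outputs"):
    """Write a one-page PDF for a phase; returns its path, or None if reportlab is missing."""
    try:
        from reportlab.lib.pagesizes import letter
        from reportlab.pdfgen import canvas
    except ImportError:
        return None
    os.makedirs(out_dir, exist_ok=True)
    path = os.path.join(out_dir, f"{phase.lower()}_report.pdf")
    c = canvas.Canvas(path, pagesize=letter)
    c.drawString(72, 720, f"SDLC Phase: {phase}")
    y = 700
    for line in text.splitlines():  # naive layout: one drawString per line
        c.drawString(72, y, line)
        y -= 14
    c.save()
    return path
```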
- `bitsandbytes` may not always be available or stable; the app will still run without it (using CPU or CUDA as available).
- If VRAM/RAM is limited, prefer `microsoft/Phi-3-mini-4k-instruct`.
- No external APIs are called. In offline mode, no network calls are made at all.
- All data remains on your machine.
- Add voice input via local Whisper (e.g., `faster-whisper`) and a small audio recorder widget.
- Add risk matrix and Gantt charts (Plotly).
- Integrate with local Git to show commit history per phase.
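For the Git integration item, one possible local-only direction is shelling out to `git log` via `subprocess` — everything stays on the machine, in keeping with the privacy goals. This is a sketch of a future feature, not implemented code; the function name and output format are illustrative:

```python
import subprocess

def recent_commits(repo_path: str = ".", limit: int = 10) -> list[str]:
    """Return 'short-hash subject' lines from local git history; no network involved."""
    result = subprocess.run(
        ["git", "-C", repo_path, "log", f"-{limit}", "--pretty=format:%h %s"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:  # not a git repo, or git not installed
        return []
    return result.stdout.splitlines()
```

Mapping commits to SDLC phases could then be as simple as filtering subjects by a phase tag convention (e.g., `[design] ...`).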