
# nanoGPT

## Language generation - Shakespeare

The nanoGPT folder contains an implementation of GPT built completely from scratch using only numpy and torch.
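The core of any GPT built from scratch is causal self-attention. As a rough sketch of the idea (this is an illustrative numpy version, not the repo's actual code, which the README says also uses torch), each token attends only to itself and earlier tokens via a masked scaled dot-product:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a (T, d) sequence."""
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = (q @ k.T) / np.sqrt(d)               # (T, T) attention logits
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf                        # block attention to future tokens
    return softmax(scores) @ v                    # (T, d) weighted values

rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
W = [rng.normal(size=(d, d)) for _ in range(3)]
out = causal_self_attention(x, *W)
print(out.shape)  # (4, 8)
```

Because of the causal mask, the first token's output is exactly its own value vector, which is an easy sanity check for a from-scratch implementation.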

  1. The first use case explored here is language generation on the tiny Shakespeare dataset. Ablation studies are carried out to determine the relative importance of different components of the Transformer architecture, using model performance as the metric.
  2. Text generation using this model is compared against ChatGPT, GPT-2, and Falcon7B-Instruct. The GPT-2 model is first fine-tuned on the Shakespeare dataset.
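An ablation study like the one described above typically retrains the model with one component disabled at a time and compares validation loss. A minimal sketch of such a loop (the component names and the `train_and_eval` function are hypothetical placeholders, not this repo's API):

```python
from itertools import product

# Hypothetical grid of Transformer components to ablate.
components = {
    "positional_encoding": [True, False],
    "layer_norm": [True, False],
    "residual_connections": [True, False],
}

def train_and_eval(config):
    # Placeholder: in a real study this would train the model on
    # tiny Shakespeare with the given components enabled and
    # return validation loss. Here it just counts disabled parts.
    return sum(1.0 for enabled in config.values() if not enabled)

results = {}
for values in product(*components.values()):
    config = dict(zip(components.keys(), values))
    results[tuple(config.items())] = train_and_eval(config)

baseline = results[tuple({k: True for k in components}.items())]
print(len(results), baseline)  # 8 configurations; baseline disables nothing
```

Comparing each ablated configuration's loss against the baseline quantifies how much each component contributes.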