
dcgan-mnist

GANs with TensorFlow on MNIST


Table of Contents

  • Overview
  • Training Results
  • My Observations
  • OS and Hardware

Overview

This repository is a hands-on implementation of a Deep Convolutional GAN (DCGAN) using TensorFlow, trained on the MNIST dataset of handwritten digits.
It is designed for learning purposes, showing how a Generator and Discriminator can be trained adversarially to produce realistic-looking digit images.
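
As a rough orientation, the sketch below shows what such a Generator/Discriminator pair can look like in Keras. The layer sizes, filter counts, and kernel sizes here are illustrative assumptions and are not necessarily identical to the architectures used in the notebooks.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_generator(latent_dim=100):
    # Maps a latent noise vector to a 28x28x1 image (sizes are illustrative).
    return tf.keras.Sequential([
        tf.keras.Input(shape=(latent_dim,)),
        layers.Dense(7 * 7 * 128, use_bias=False),
        layers.BatchNormalization(),
        layers.LeakyReLU(),
        layers.Reshape((7, 7, 128)),
        layers.Conv2DTranspose(64, kernel_size=5, strides=2, padding="same", use_bias=False),  # -> 14x14
        layers.BatchNormalization(),
        layers.LeakyReLU(),
        layers.Conv2DTranspose(1, kernel_size=5, strides=2, padding="same", activation="tanh"),  # -> 28x28
    ])

def build_discriminator():
    # Maps a 28x28x1 image to a single real/fake logit.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28, 1)),
        layers.Conv2D(64, kernel_size=5, strides=2, padding="same"),
        layers.LeakyReLU(),
        layers.Dropout(0.3),
        layers.Conv2D(128, kernel_size=5, strides=2, padding="same"),
        layers.LeakyReLU(),
        layers.Dropout(0.3),
        layers.Flatten(),
        layers.Dense(1),  # raw logit; pair with a from_logits=True loss
    ])
```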


Training Results

Training Results from basic_gan.ipynb

Sample outputs from training over 5 epochs:

| Epoch | Discriminator Loss | Generator Loss | Sample Output |
|-------|--------------------|----------------|---------------|
| 1     | 0.6533             | 1.3067         | (image)       |
| 2     | 1.1693             | 1.0325         | (image)       |
| 3     | 1.1358             | 1.1644         | (image)       |
| 4     | 0.9302             | 1.0952         | (image)       |
| 5     | 0.9230             | 1.3392         | (image)       |
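
For context on the two loss columns, a standard DCGAN training step computes them roughly as sketched below: the discriminator loss is binary cross-entropy on real and generated batches, and the generator loss rewards fooling the discriminator. This is a generic sketch; the notebooks may differ in details such as label smoothing or optimizer settings.

```python
import tensorflow as tf

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def discriminator_loss(real_logits, fake_logits):
    # Real images should be classified as 1, generated images as 0.
    real_loss = bce(tf.ones_like(real_logits), real_logits)
    fake_loss = bce(tf.zeros_like(fake_logits), fake_logits)
    return real_loss + fake_loss

def generator_loss(fake_logits):
    # Generated images should be classified as 1 (i.e. fool the discriminator).
    return bce(tf.ones_like(fake_logits), fake_logits)

@tf.function
def train_step(images, generator, discriminator, gen_opt, disc_opt, latent_dim=100):
    noise = tf.random.normal([tf.shape(images)[0], latent_dim])
    with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:
        fake_images = generator(noise, training=True)
        real_logits = discriminator(images, training=True)
        fake_logits = discriminator(fake_images, training=True)
        g_loss = generator_loss(fake_logits)
        d_loss = discriminator_loss(real_logits, fake_logits)
    # Update both networks from their own loss.
    gen_opt.apply_gradients(zip(gen_tape.gradient(g_loss, generator.trainable_variables),
                                generator.trainable_variables))
    disc_opt.apply_gradients(zip(disc_tape.gradient(d_loss, discriminator.trainable_variables),
                                 discriminator.trainable_variables))
    return d_loss, g_loss
```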

Training Results from advanced_gan.ipynb

Sample outputs from training over 5 epochs:

| Epoch | Discriminator Loss | Generator Loss | Sample Output |
|-------|--------------------|----------------|---------------|
| 1     | 1.3690             | 0.9053         | (image)       |
| 2     | 1.3009             | 0.9084         | (image)       |
| 3     | 1.3500             | 0.9443         | (image)       |
| 4     | 1.3614             | 0.9069         | (image)       |
| 5     | 1.3124             | 0.9217         | (image)       |
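
Per-epoch sample grids like those in the "Sample Output" column are typically produced by passing a fixed noise batch through the generator at the end of each epoch and saving the result, for example as in the sketch below. The grid size and file names here are illustrative assumptions, not necessarily what the notebooks use.

```python
import matplotlib.pyplot as plt
import tensorflow as tf

# A fixed noise batch so each epoch's grid shows the same latent points evolving.
fixed_noise = tf.random.normal([16, 100])

def save_sample_grid(generator, epoch, noise=fixed_noise):
    images = generator(noise, training=False)   # tanh output, values in [-1, 1]
    images = (images + 1.0) / 2.0               # rescale to [0, 1] for plotting
    fig, axes = plt.subplots(4, 4, figsize=(4, 4))
    for img, ax in zip(images, axes.flat):
        ax.imshow(img[:, :, 0], cmap="gray")
        ax.axis("off")
    fig.savefig(f"samples_epoch_{epoch:02d}.png")
    plt.close(fig)
```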

My Observations

Training required between 30–60 GB of RAM and took around 1–3 hours depending on system load and configuration.
On my run, it took ~62 minutes.
For cloud experiments, renting an EC2 high-memory instance at spot pricing would keep training costs around $1 per run.


OS and Hardware

  • OS (my setup): Windows 11 Pro

  • Hardware (my setup): My Almighty HP EliteBook 840 G3

    • Intel Core i5 (6th Gen)
    • 16 GB RAM
    • Integrated Intel HD Graphics 520 (no CUDA support → training was CPU-bound and slower)
  • Limitations Encountered:

    • CPU-only training significantly increased runtime.
    • Limited RAM availability (16 GB) constrained batch sizes and prolonged convergence.
    • No dedicated GPU → unsuitable for scaling to larger datasets or higher-resolution GANs.
  • Alternative Setup (recommended for faster training):
    Ubuntu 12.10 on an Amazon EC2 High-Memory Quadruple Extra Large (m2.4xlarge) instance.
    This instance type offers ample memory for practical GAN training, but it has no GPU; for GPU acceleration, a GPU-equipped instance family would be needed. (A quick device-visibility check is sketched after this list.)
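
One quick way to confirm whether TensorFlow can see a CUDA GPU, and to pick a batch size accordingly on a RAM-limited machine like the one above, is sketched below. The batch-size values are illustrative assumptions, not the notebooks' settings.

```python
import tensorflow as tf

# Report whether TensorFlow sees a CUDA GPU; with integrated graphics only,
# this list is empty and training runs on the CPU.
gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus or "none (CPU-only training)")

# With limited RAM (e.g. 16 GB) a smaller batch keeps memory pressure manageable.
BATCH_SIZE = 64 if not gpus else 256
```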

