neural-network-from-scratch

This is my attempt at implementing a deep neural network library from scratch.

  • I used the MNIST dataset of handwritten digits to test the implementation. In the main file, I tested a network with 2 hidden layers of 8 neurons each and an output layer of 10 neurons that gives the prediction.
  • As the optimizer, I used Stochastic Gradient Descent, splitting the training data into mini-batches.
  • For the activation functions, I implemented the option to use either the sigmoid function for all layers, or ReLU for the hidden layers with softmax for the output layer. However, this version only works with the sigmoid option for now. A minimal sketch of the sigmoid activation and the mini-batch split is shown after this list.
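
As an illustration, a minimal sketch of the sigmoid activation and the mini-batch split might look like the following; the function names and the use of NumPy here are illustrative assumptions, not necessarily what this library does internally.

```python
import random
import numpy as np

def sigmoid(z):
    # Element-wise sigmoid activation (illustrative, not the library's own code)
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid, needed during backpropagation
    return sigmoid(z) * (1.0 - sigmoid(z))

def make_mini_batches(train_data, mini_batch_size):
    # Shuffle the (x, y) pairs and split them into mini-batches for SGD
    data = list(train_data)
    random.shuffle(data)
    return [data[k:k + mini_batch_size]
            for k in range(0, len(data), mini_batch_size)]
```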

How to use it:

  • In the main.py file, specify the size of each layer in a list, depending on your needs, like this:

```python
layers_sizes = [28*28, 8, 8, 10]  # 28*28 is the input layer, 10 is the output layer
```

  • Load the data from any source you want; I loaded the MNIST data:

```python
from keras.datasets import mnist

(train_X, train_y), (test_X, test_y) = mnist.load_data()
```

  • Transform your data into a list of (x, y) tuples, one per datapoint; a sketch of one way to do this for MNIST is shown below.
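
A minimal sketch of this step for MNIST follows; the (784, 1) column-vector shape and the one-hot encoding of the labels are assumptions about the format the Network class expects, not something specified in this README.

```python
import numpy as np

def to_datapoints(images, labels, num_classes=10):
    # Flatten each 28x28 image into a (784, 1) column vector scaled to [0, 1]
    # and one-hot encode each label as a (num_classes, 1) column vector.
    # This exact format is an assumption about what Network expects.
    data = []
    for img, lbl in zip(images, labels):
        x = img.reshape(28 * 28, 1).astype(np.float32) / 255.0
        y = np.zeros((num_classes, 1), dtype=np.float32)
        y[lbl] = 1.0
        data.append((x, y))
    return data

train_data = to_datapoints(train_X, train_y)
test_data = to_datapoints(test_X, test_y)
```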
  • Instantiate the Network class with the layer sizes specified above:

```python
net = Network(layers_sizes)
```

  • Use the train method to train the model:

```python
net.train(epoch, train_data, learning_rate, test_data=None, mini_batch_size=32)
```
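
For example, with the MNIST data prepared above (the epoch count and learning rate below are illustrative values, not tuned recommendations):

```python
# Example call; 30 epochs and a learning rate of 3.0 are placeholder values
net.train(30, train_data, 3.0, test_data=test_data, mini_batch_size=32)
```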

Improvements

To be done.
