One of my earliest obsessions has been Neural Networks and Machine Learning. I learned to implement Neural Networks using libraries such as TensorFlow and PyTorch, but relying on those libraries left me without a deeper understanding of how the components of a Neural Network fit together to produce a model that can make predictions from data. So I came up with the idea of recreating a Neural Network from scratch.
To build this project, I first planned out each component I would recreate: activation layers, loss/cost functions, gradient descent, and more. I then translated the math into code and organized it for easy use. The Neural Network is implemented in a list-like manner, where each layer is added sequentially.
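A minimal sketch of what such a list-like, sequential design can look like (the class and method names here are illustrative, not necessarily the repo's actual API):

```python
import numpy as np

class Dense:
    """Fully connected layer: y = xW + b."""
    def __init__(self, n_in, n_out):
        self.W = np.random.randn(n_in, n_out) * 0.1
        self.b = np.zeros(n_out)

    def forward(self, x):
        return x @ self.W + self.b

class ReLU:
    """Activation layer: element-wise max(0, x)."""
    def forward(self, x):
        return np.maximum(0, x)

class Sequential:
    """Holds layers in a plain list and applies them in order."""
    def __init__(self):
        self.layers = []

    def add(self, layer):
        self.layers.append(layer)

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

# Build a small network layer by layer, like appending to a list
model = Sequential()
model.add(Dense(3, 4))
model.add(ReLU())
model.add(Dense(4, 1))

out = model.forward(np.zeros((2, 3)))
print(out.shape)  # (2, 1)
```

Storing layers in a list keeps the forward pass a simple loop, and the backward pass can later walk the same list in reverse.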
Creating a Neural Network from scratch is a big undertaking, since it demands a far deeper understanding of how each piece works than simply calling a library. As expected, this project required a lot of research and many hours spent learning about each component. Implementing gradient descent in particular required multivariable calculus, because the weight updates are driven by partial derivatives of the loss with respect to each parameter.
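To illustrate the calculus involved (a standalone toy example, not the repo's code): gradient descent nudges each parameter opposite the partial derivative of the loss with respect to that parameter.

```python
import numpy as np

# Toy loss: f(w) = (w0 - 3)^2 + (w1 + 1)^2, minimized at w = [3, -1].
# Partial derivatives: df/dw0 = 2*(w0 - 3), df/dw1 = 2*(w1 + 1).
def grad(w):
    return np.array([2 * (w[0] - 3), 2 * (w[1] + 1)])

w = np.zeros(2)   # start at the origin
lr = 0.1          # learning rate

for _ in range(100):
    w -= lr * grad(w)  # step against the gradient

print(w)  # converges toward [3, -1]
```

In a real network the same update rule applies, except the partial derivatives are computed layer by layer via the chain rule (backpropagation).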
- NumPy
- Python
git clone https://github.com/aakashvishcoder/NeuralNetworkFromScratch.git
cd NeuralNetworkFromScratch