This project implements a neural network class without using any ML library.
```
.
|-- ANN/
|   |-- Layer.py
|   |-- Dense.py
|   |-- Network.py
|   |-- Activation.py
|   |-- Activation_functions/
|   |   |-- Tanh.py
|   |-- Loss_functions/
|   |   |-- MSE.py
|-- mnist_digits.py
```
- `ANN/`: Directory for your neural network classes.
  - `Layer.py`: Implementation of the `BaseLayer` class.
  - `Dense.py`: Implementation of the `Dense` class.
  - `Network.py`: Implementation of the `Network` class.
  - `Activation.py`: Implementation of the `Activation` class.
  - `Activation_functions/`: Contains activation function implementations.
    - `Tanh.py`: Implementation of the hyperbolic tangent activation function (see the sketch after this list).
  - `Loss_functions/`: Contains loss function implementations.
    - `MSE.py`: Implementation of the Mean Squared Error loss function.
- `mnist_digits.py`: Main script or application where you use the neural network.
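To give a rough idea of what the activation and loss modules contain, here is a minimal numpy sketch of a tanh activation and an MSE loss together with their derivatives. The function names (`tanh_prime`, `mse_prime`) and the plain-function style are assumptions for illustration; the actual `Tanh.py` and `MSE.py` may expose a class-based interface instead.

```python
import numpy as np

# Sketch of a tanh activation and its derivative, in the spirit of
# Activation_functions/Tanh.py (names and style are assumptions).
def tanh(x):
    return np.tanh(x)

def tanh_prime(x):
    # d/dx tanh(x) = 1 - tanh(x)^2, used during backpropagation
    return 1.0 - np.tanh(x) ** 2

# Sketch of Mean Squared Error and its gradient, in the spirit of
# Loss_functions/MSE.py (names and style are assumptions).
def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def mse_prime(y_true, y_pred):
    # gradient of the mean of squared errors with respect to y_pred
    return 2.0 * (y_pred - y_true) / y_true.size
```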
To get started:

- Clone the repository:

  ```
  git clone https://github.com/sudarshanmg/ANN.git
  ```

- Install dependencies:

  ```
  pip install -r requirements.txt
  ```

- Run the MNIST digits script:

  ```
  python mnist_digits.py
  ```

- Modify and run `mnist_digits.py` to experiment with the neural network on the MNIST dataset (a self-contained sketch of such an experiment follows this list).
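To show the moving parts behind such an experiment, here is a hypothetical, self-contained numpy sketch of a small training loop: a two-layer tanh network trained with full-batch gradient descent on an MSE loss. It is an illustration only, not the repo's actual `mnist_digits.py`; the layer sizes, learning rate, and the use of scikit-learn's small 8x8 digits set as a stand-in for MNIST are all assumptions, and scikit-learn may not be in this repo's requirements.txt.

```python
import numpy as np
from sklearn.datasets import load_digits  # assumption: 8x8 digits as a stand-in for MNIST

# Hypothetical training-loop sketch; the real mnist_digits.py and the class
# interfaces in ANN/ (Network, Dense, Activation, ...) may look different.
rng = np.random.default_rng(0)

digits = load_digits()
X = digits.data / 16.0               # scale pixel values to [0, 1], shape (1797, 64)
Y = np.eye(10)[digits.target]        # one-hot labels, shape (1797, 10)

# two dense layers with tanh activations: 64 -> 32 -> 10
W1 = rng.normal(0.0, 0.1, (64, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 10)); b2 = np.zeros(10)
lr = 0.5

for epoch in range(200):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    out = np.tanh(h @ W2 + b2)

    # MSE loss and its gradient with respect to the network output
    loss = np.mean((out - Y) ** 2)
    grad_out = 2.0 * (out - Y) / Y.size

    # backward pass: chain rule through tanh and the dense layers
    grad_z2 = grad_out * (1.0 - out ** 2)
    grad_h = grad_z2 @ W2.T
    grad_z1 = grad_h * (1.0 - h ** 2)

    # plain full-batch gradient-descent updates
    W2 -= lr * (h.T @ grad_z2); b2 -= lr * grad_z2.sum(axis=0)
    W1 -= lr * (X.T @ grad_z1); b1 -= lr * grad_z1.sum(axis=0)

    if epoch % 50 == 0:
        acc = (out.argmax(axis=1) == digits.target).mean()
        print(f"epoch {epoch:3d}  loss {loss:.4f}  train accuracy {acc:.2f}")
```

The printed loss should trend downward as the weights are updated. The classes in `ANN/` wrap these same steps (forward pass, backward pass, parameter update) behind reusable layer objects, which is what `mnist_digits.py` builds on.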
Head to the Tutorial Section.
Contributions are welcome: create an issue or submit a pull request.
This project is licensed under the MIT License.