Neural Network
Vaibhav Vardhan edited this page Dec 5, 2019 · 1 revision
- Input Layer [18 input parameters]
- 1st Dense Layer [128 neurons] [ReLU activation]
- Dropout [0.2]
- 2nd Dense Layer [128 neurons] [ReLU activation]
- Dropout [0.2]
- 3rd Dense Layer [128 neurons] [ReLU activation]
- Dropout [0.2]
- Output Layer [3 output parameters] [Linear activation]
Loss function: Mean Squared Error
Optimizer: RMSprop (RMS propagation)
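The layer stack above can be sketched as a plain forward pass. This is a hypothetical illustration, not the repository's actual code: the weights are random stand-ins for learned parameters, and the shapes follow the list above (18 inputs, three 128-unit ReLU layers with 0.2 dropout, 3 linear outputs).

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def dense(x, w, b):
    return x @ w + b

# Hypothetical weights; in the trained network these come from fitting
# with Mean Squared Error loss and the RMSprop optimizer.
w1, b1 = rng.standard_normal((18, 128)) * 0.05, np.zeros(128)
w2, b2 = rng.standard_normal((128, 128)) * 0.05, np.zeros(128)
w3, b3 = rng.standard_normal((128, 128)) * 0.05, np.zeros(128)
wo, bo = rng.standard_normal((128, 3)) * 0.05, np.zeros(3)

def forward(x, training=False, drop=0.2):
    # Dropout is active only during training; at inference it is a no-op.
    def maybe_drop(h):
        if not training:
            return h
        mask = rng.random(h.shape) >= drop
        return h * mask / (1.0 - drop)  # inverted-dropout scaling

    h = maybe_drop(relu(dense(x, w1, b1)))   # 1st Dense Layer + Dropout
    h = maybe_drop(relu(dense(h, w2, b2)))   # 2nd Dense Layer + Dropout
    h = maybe_drop(relu(dense(h, w3, b3)))   # 3rd Dense Layer + Dropout
    return dense(h, wo, bo)                  # Output Layer, linear activation

x = rng.standard_normal((4, 18))  # batch of 4 samples, 18 input parameters each
y = forward(x)
print(y.shape)  # (4, 3): 3 output parameters per sample
```

Because the output activation is linear and the loss is Mean Squared Error, this network regresses three continuous values rather than classifying.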