Neural Network

Vaibhav Vardhan edited this page Dec 5, 2019 · 1 revision

Network Summary

  1. Input Layer [18 input parameters]
  2. 1st Dense Layer [128 neurons] [ReLU activation]
  3. Dropout [0.2]
  4. 2nd Dense Layer [128 neurons] [ReLU activation]
  5. Dropout [0.2]
  6. 3rd Dense Layer [128 neurons] [ReLU activation]
  7. Dropout [0.2]
  8. Output Layer [3 output parameters] [Linear activation]

Loss function: Mean Squared Error

Optimizer: RMSprop (root mean square propagation)
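The stack above can be sketched as a small sequential model. This is a minimal illustration, assuming a `tf.keras` implementation; the layer sizes, dropout rate, loss, and optimizer come from the summary, while the framework choice and the `build_model` helper name are assumptions.

```python
# Hypothetical sketch of the network summarized above (framework assumed).
import tensorflow as tf
from tensorflow.keras import layers, models


def build_model():
    model = models.Sequential([
        layers.Input(shape=(18,)),             # 18 input parameters
        layers.Dense(128, activation="relu"),  # 1st dense layer
        layers.Dropout(0.2),
        layers.Dense(128, activation="relu"),  # 2nd dense layer
        layers.Dropout(0.2),
        layers.Dense(128, activation="relu"),  # 3rd dense layer
        layers.Dropout(0.2),
        layers.Dense(3, activation="linear"),  # 3 output parameters
    ])
    # Mean squared error loss with the RMSprop optimizer, as stated above
    model.compile(loss="mse", optimizer="rmsprop")
    return model


model = build_model()
```

Calling `model.predict` on a batch of shape `(n, 18)` would then yield predictions of shape `(n, 3)`.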
