An easy way to derive backward propagation for all kinds of neural networks

In practice, most of the time we only need to implement the forward propagation, and the framework handles the backward propagation for us. But if you're interested in working through all the pieces of a neural network yourself, I'd like to share some tricks that make it easy to derive the backward propagation for all kinds of neural networks (LSTM, CNN, etc.). A quick illustration of the idea follows.
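As a minimal, generic sketch (not the notebook's derivation, just an illustration of deriving a backward pass by hand): for a single affine layer `Y = X @ W + b` with loss `L = 0.5 * sum(Y**2)`, the hand-derived gradients can be checked against a numerical gradient.

```python
# Hand-derive the backward pass of one affine layer and verify it numerically.
# Forward: Y = X @ W + b, loss L = 0.5 * sum(Y**2).
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2))
b = rng.standard_normal(2)

def forward(X, W, b):
    return X @ W + b

def loss(Y):
    return 0.5 * np.sum(Y ** 2)

# Hand-derived backward pass:
#   dL/dY = Y
#   dL/dW = X^T @ dL/dY   (shapes: (3,4) @ (4,2) -> (3,2), matching W)
#   dL/db = sum of dL/dY over the batch axis
Y = forward(X, W, b)
dY = Y
dW = X.T @ dY
db = dY.sum(axis=0)

# Check each hand-derived gradient with centered finite differences.
def numerical_grad(f, param, eps=1e-6):
    grad = np.zeros_like(param)
    it = np.nditer(param, flags=["multi_index"])
    for _ in it:
        idx = it.multi_index
        old = param[idx]
        param[idx] = old + eps
        plus = f()
        param[idx] = old - eps
        minus = f()
        param[idx] = old  # restore the original value
        grad[idx] = (plus - minus) / (2 * eps)
    return grad

dW_num = numerical_grad(lambda: loss(forward(X, W, b)), W)
db_num = numerical_grad(lambda: loss(forward(X, W, b)), b)
print(np.allclose(dW, dW_num, atol=1e-4))  # True
print(np.allclose(db, db_num, atol=1e-4))  # True
```

The same shape-matching trick (the gradient of a parameter must have the parameter's shape, which pins down where the transposes go) extends to the more complex layers covered in the notebook.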

Check out the notebook backward_propagation_for_all.ipynb for all the details. It's best to open it in Jupyter Notebook, as the GitHub website renders some of the equations poorly.