This lecture starts with a few primers and builds up to a full neural network that can solve a non-trivial problem.
- Linear Primer - We look at the nature of linearity and linear transformations.
- Numpy - We introduce the Jupyter notebook and look at NumPy - the language spoken by TensorFlow and PyTorch (sort of).
- Non-Linearity - We identify the limitations of linear transforms and exploit non-linearity with an illuminating example.
- Code - Linear-Separator - We implement a small linear model in Keras.
- Code - Non Linearity.ipynb - We express the non-linear thought experiment in code and look at the standard deep learning building blocks.
- Code - Universality - We illustrate the universal nature of these building blocks by taking our existing code and training it on the XOR function.
- Code - MNIST MLP.ipynb - We take the building blocks covered in the toy problems above and apply them to a real computer vision problem.