# Lecture 2 - Linearity, Non-linearity, Simple Networks

This lecture starts with a few primers and builds up to a full neural network that can solve a non-trivial problem.

- Linear Primer - We look at the nature of linearity and linear transformations (see the NumPy sketch after this list).
- NumPy - We introduce the Jupyter notebook and look at NumPy, the language spoken (sort of) by TensorFlow and PyTorch.
- Non-Linearity - We identify the limitations of linear transformations and exploit non-linearity with an illuminating example.
- Code - Linear-Separator - We implement a small linear model in Keras.
- Code - Non Linearity.ipynb - We express the non-linear thought experiment in code and look at the standard deep-learning building blocks.
- Code - Universality - We illustrate the universal nature of these building blocks by taking our existing code and training it on the XOR function (see the Keras sketch below).
- Code - MNIST MLP.ipynb - We take the building blocks covered in the toy problems and apply them to a real computer vision problem (see the MNIST sketch below).
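
As a warm-up for the linear primer, here is a minimal NumPy sketch of what linearity means in practice: a matrix transform satisfies superposition, `f(a*x + b*y) == a*f(x) + b*f(y)`. The matrix and vectors below are arbitrary illustrative values, not taken from the lecture notebooks.

```python
import numpy as np

# Any matrix multiplication is a linear transformation: it preserves
# addition and scalar multiplication (superposition).
W = np.array([[2.0, -1.0],
              [0.5,  3.0]])

def f(v):
    return W @ v  # f(v) = Wv

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])
a, b = 2.0, -1.5

# Linearity: f(a*x + b*y) equals a*f(x) + b*f(y)
lhs = f(a * x + b * y)
rhs = a * f(x) + b * f(y)
print(np.allclose(lhs, rhs))  # True
```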
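
The non-linearity and universality notebooks culminate in learning XOR, a pattern no purely linear model can separate. Here is a minimal sketch assuming a TensorFlow/Keras setup; the layer width, activation, and training schedule are illustrative choices, not necessarily those used in the lecture code.

```python
import numpy as np
from tensorflow import keras

# XOR truth table: not linearly separable, so a single linear layer fails.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
y = np.array([[0], [1], [1], [0]], dtype=np.float32)

# One hidden layer with a non-linear activation is enough to bend the
# decision boundary around the XOR pattern.
model = keras.Sequential([
    keras.Input(shape=(2,)),
    keras.layers.Dense(8, activation="tanh"),   # illustrative width
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=keras.optimizers.Adam(0.05),
              loss="binary_crossentropy")
model.fit(X, y, epochs=500, verbose=0)

print(model.predict(X, verbose=0).round().ravel())  # expect [0, 1, 1, 0]
```

Dropping the hidden layer turns this back into the linear separator from the earlier notebook, which is exactly the model that cannot fit XOR.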
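
For the MNIST MLP, the same Dense-plus-activation building blocks scale to a real vision task. The sketch below uses plausible default hyperparameters, not necessarily the notebook's.

```python
import numpy as np
from tensorflow import keras

# Load MNIST: 60k training images of handwritten digits, 28x28 grayscale.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Flatten images to 784-dim vectors and scale pixels to [0, 1].
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# The same Dense + non-linearity blocks, just wider and with a
# 10-way softmax output for the digit classes.
model = keras.Sequential([
    keras.Input(shape=(784,)),
    keras.layers.Dense(128, activation="relu"),  # illustrative width
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1)

print(model.evaluate(x_test, y_test, verbose=0))  # [loss, accuracy]
```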