Lecture 2 - Linearity, Non-linearity, Simple Networks

This lecture starts with a few primers and builds up to a full neural network that can solve a non-trivial problem.

  • Linear Primer - We look at the nature of linearity and linear transformations (a short NumPy sketch of this follows the list).
  • Numpy - We introduce the Jupyter notebook and look at NumPy - the language spoken by TensorFlow and PyTorch (sort of).
  • Non-Linearity - We identify the limitations of linear transforms and exploit non-linearity with an illuminating example.
  • Code - Linear-Separator - We implement a small linear model in Keras.
  • Code - Non Linearity.ipynb - We express the non-linear thought experiment in code and look at the standard deep learning building blocks.
  • Code - Universality - We illustrate the universal nature of these building blocks by taking our existing code and training it on the XOR function (see the XOR sketch after this list).
  • Code - MNIST MLP.ipynb - We take the building blocks covered in the toy problems and use them to solve a real computer vision problem (see the MNIST sketch after this list).
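
A minimal NumPy sketch of the linearity primer: a matrix is a linear transformation, and composing two linear maps collapses into a single matrix, which is why stacking purely linear layers adds no expressive power. The specific matrices here are illustrative, not taken from the notebook.

```python
import numpy as np

# A linear transformation is just a matrix: it preserves
# vector addition and scalar multiplication.
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])   # scales x by 2 and y by 0.5

v = np.array([1.0, 4.0])
w = np.array([3.0, -2.0])

# Linearity: A(v + w) == A(v) + A(w) and A(c * v) == c * A(v)
assert np.allclose(A @ (v + w), A @ v + A @ w)
assert np.allclose(A @ (3.0 * v), 3.0 * (A @ v))

# Composing two linear maps is just another matrix product,
# so a stack of linear layers is no more powerful than one linear layer.
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # 90-degree rotation
assert np.allclose(B @ (A @ v), (B @ A) @ v)
```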
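
A hedged Keras sketch in the spirit of the non-linearity and universality notebooks: a single linear layer cannot separate XOR, but one small non-linear hidden layer can. The layer width, activation, optimizer, and epoch count are assumptions rather than the notebooks' exact choices, and the notebooks may import standalone `keras` instead of `tensorflow.keras`.

```python
import numpy as np
from tensorflow import keras  # the notebooks may use `import keras` instead

# XOR is not linearly separable, so a purely linear model cannot fit it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype="float32")
y = np.array([[0], [1], [1], [0]], dtype="float32")

model = keras.Sequential([
    keras.layers.Dense(8, activation="tanh", input_shape=(2,)),  # non-linear hidden layer
    keras.layers.Dense(1, activation="sigmoid"),                 # probability of class 1
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2000, verbose=0)

print(model.predict(X).round().ravel())  # expected: [0. 1. 1. 0.]
```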
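
A sketch of an MNIST MLP along the lines of the standard Keras example: flatten the 28x28 images, stack two ReLU layers with dropout, and finish with a 10-way softmax. The architecture and hyperparameters in the course notebook may differ.

```python
from tensorflow import keras

# Load MNIST, flatten 28x28 images into 784-dim vectors, scale to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = keras.Sequential([
    keras.layers.Dense(512, activation="relu", input_shape=(784,)),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(512, activation="relu"),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(10, activation="softmax"),  # one output per digit class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",  # integer labels 0-9
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128,
          validation_data=(x_test, y_test))
```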