lecture_2/README.md
This lecture starts with a few primers and builds up to a full neural network that can solve a non-trivial problem.
* [Linear Primer](linear_primer.pdf) - We look at the nature of linearity and linear transformations.
* [Numpy](1.%20numpy.ipynb) - We introduce the `jupyter notebook` and look at NumPy - the language spoken by TensorFlow and PyTorch (sort of).
* [Non-Linearity](3.%20Non%20Linearity.ipynb) - We identify the limitations of linear transforms and exploit non-linearity with an illuminating example.
* [Code - Linear Separator](4.%20Code%20%20Linear%20Separator.ipynb) - We implement a small linear model in Keras (a minimal sketch appears after this list).
* [Code - Non Linearity](5.%20Code%20-%20Non%20Linearity.ipynb) - We express the non-linear thought experiment in code and look at the standard deep learning building blocks.
* [Code - Universality](6.%20Universality%20-%20XOR.ipynb) - We illustrate the universal nature of these building blocks by taking our existing code and training it on the XOR function (see the second sketch below).
* [Code - MNIST MLP](7.%20Code%20-%20MNIST%20MLP.ipynb) - We take the building blocks covered in the toy problems and use them to solve a real computer vision problem (see the final sketch below).
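Finally, a sketch of an MNIST multi-layer perceptron in the same style. The layer sizes and training settings are assumptions for illustration, not the notebook's exact configuration; `keras.datasets.mnist` is the standard Keras loader.

```python
# Hypothetical sketch of an MNIST MLP; architecture and hyper-parameters
# are illustrative assumptions, not the notebook's code.
from tensorflow import keras

# Load MNIST and flatten the 28x28 images into 784-dimensional vectors.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = keras.Sequential([
    keras.Input(shape=(784,)),
    keras.layers.Dense(128, activation="relu"),    # hidden layer
    keras.layers.Dense(10, activation="softmax"),  # one probability per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128, verbose=0)
print(model.evaluate(x_test, y_test, verbose=0))  # [test loss, test accuracy]
```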