Commit 47a2000

removed requirements file

Merge commit, 2 parents: 2fc7b44 + eaf6822

File tree

6 files changed: +22 additions, -14 deletions


Exercise1/exercise1.ipynb

+3 -3

@@ -168,7 +168,7 @@
 "\n",
 "Execute the next cell to grade your solution to the first part of this exercise.\n",
 "\n",
-"*You should now submit you solutions.*"
+"*You should now submit your solutions.*"
 ]
 },
 {
@@ -508,7 +508,7 @@
 " X : array_like\n",
 " The input dataset of shape (m x n+1).\n",
 " \n",
-" y : arra_like\n",
+" y : array_like\n",
 " Value at given features. A vector of shape (m, ).\n",
 " \n",
 " theta : array_like\n",
@@ -845,7 +845,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"*You should not submit your solutions.*"
+"*You should now submit your solutions.*"
 ]
 },
 {

Exercise4/exercise4.ipynb

+5 -1

@@ -664,6 +664,7 @@
 "Note that the symbol $*$ performs element wise multiplication in `numpy`.\n",
 "\n",
 "1. Accumulate the gradient from this example using the following formula. Note that you should skip or remove $\\delta_0^{(2)}$. In `numpy`, removing $\\delta_0^{(2)}$ corresponds to `delta_2 = delta_2[1:]`.\n",
+"$$ \\Delta^{(l)} = \\Delta^{(l)} + \\delta^{(l+1)} (a^{(l)})^{(T)} $$\n",
 "\n",
 "1. Obtain the (unregularized) gradient for the neural network cost function by dividing the accumulated gradients by $\\frac{1}{m}$:\n",
 "$$ \\frac{\\partial}{\\partial \\Theta_{ij}^{(l)}} J(\\Theta) = D_{ij}^{(l)} = \\frac{1}{m} \\Delta_{ij}^{(l)}$$\n",
@@ -672,7 +673,10 @@
 "**Python/Numpy tip**: You should implement the backpropagation algorithm only after you have successfully completed the feedforward and cost functions. While implementing the backpropagation alogrithm, it is often useful to use the `shape` function to print out the shapes of the variables you are working with if you run into dimension mismatch errors.\n",
 "</div>\n",
 "\n",
-"[Click here to go back and update the function `nnCostFunction` with the backpropagation algorithm](#nnCostFunction)."
+"[Click here to go back and update the function `nnCostFunction` with the backpropagation algorithm](#nnCostFunction).\n",
+"\n",
+"\n",
+"**Note:** If the iterative solution provided above is proving to be difficult to implement, try implementing the vectorized approach which is easier to implement in the opinion of the moderators of this course. You can find the tutorial for the vectorized approach [here](https://www.coursera.org/learn/machine-learning/discussions/all/threads/a8Kce_WxEeS16yIACyoj1Q)."
 ]
 },
 {
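The accumulation formula added in the first hunk, $\Delta^{(l)} = \Delta^{(l)} + \delta^{(l+1)} (a^{(l)})^T$, is a per-example outer product. A minimal numpy sketch with assumed toy shapes (these variables are illustrative, not the notebook's actual code):

```python
import numpy as np

# Hypothetical small layer: a1 is the previous layer's activation with a
# bias unit prepended; delta2 is the next layer's error term.
rng = np.random.default_rng(0)
a1 = np.concatenate([[1.0], rng.standard_normal(3)])  # shape (4,), bias + 3 units
delta2 = rng.standard_normal(4)                       # shape (4,), next-layer error

Delta1 = np.zeros((4, 4))  # gradient accumulator for this layer's Theta

# Accumulate for one training example: Delta^(l) += delta^(l+1) (a^(l))^T
Delta1 += np.outer(delta2, a1)

# After looping over all m examples, divide by m to get the gradient D^(l):
m = 1
D1 = Delta1 / m
print(D1.shape)  # (4, 4)
```

For a single example the outer product gives the full matrix of partial derivatives in one step, which is why the vectorized approach mentioned in the note tends to be easier than element-wise loops.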

Exercise5/exercise5.ipynb

+1 -1

@@ -649,7 +649,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"lambda_ = 100\n",
+"lambda_ = 0\n",
 "theta = utils.trainLinearReg(linearRegCostFunction, X_poly, y,\n",
 " lambda_=lambda_, maxiter=55)\n",
 "\n",

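The Exercise5 hunk above sets `lambda_ = 0`, i.e. no regularization. A generic ridge-regression illustration of why the value matters (this is not the course's `trainLinearReg`; the closed-form solution below is a standalone sketch):

```python
import numpy as np

# theta = (X^T X + lambda * L)^(-1) X^T y, where L is the identity with the
# bias entry zeroed (the bias term is conventionally not regularized).
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(20), rng.standard_normal(20)])  # bias + 1 feature
y = X @ np.array([2.0, -3.0]) + 0.1 * rng.standard_normal(20)

def ridge_theta(X, y, lambda_):
    L = lambda_ * np.eye(X.shape[1])
    L[0, 0] = 0.0
    return np.linalg.solve(X.T @ X + L, X.T @ y)

theta_unreg = ridge_theta(X, y, lambda_=0)   # close to the true [2, -3]
theta_reg = ridge_theta(X, y, lambda_=100)   # feature weight shrunk toward 0
print(abs(theta_reg[1]) < abs(theta_unreg[1]))  # True
```

With `lambda_ = 100` the fitted weights are heavily shrunk, so using it in a section meant to demonstrate *unregularized* polynomial regression would give misleading plots; setting it to 0 fixes that.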
Exercise7/exercise7.ipynb

+1 -1

@@ -52,7 +52,7 @@
 "# library written for this exercise providing additional functions for assignment submission, and others\n",
 "import utils\n",
 "\n",
-"%load_ext autoreload \n",
+"%load_ext autoreload\n",
 "%autoreload 2\n",
 "\n",
 "# define the submission/grader object for this exercise\n",

Exercise8/exercise8.ipynb

+3 -3

@@ -468,7 +468,7 @@
 "\n",
 "# From the matrix, we can compute statistics like average rating.\n",
 "print('Average rating for movie 1 (Toy Story): %f / 5' %\n",
-" np.mean(Y[0, R[0, :]]))\n",
+" np.mean(Y[0, R[0, :] == 1]))\n",
 "\n",
 "# We can \"visualize\" the ratings matrix by plotting it with imshow\n",
 "pyplot.figure(figsize=(8, 8))\n",
@@ -683,7 +683,7 @@
 "\n",
 "$$ \\frac{\\partial J}{\\partial x_k^{(i)}} = \\sum_{j:r(i,j)=1} \\left( \\left(\\theta^{(j)}\\right)^T x^{(i)} - y^{(i,j)} \\right) \\theta_k^{(j)} $$\n",
 "\n",
-"$$ \\frac{\\partial J}{\\partial \\theta_k^{(j)}} = \\sum_{i:r(i,j)=1} \\left( \\left(\\theta^{(j)}\\right)^T x^{(i)}- y^{(i,j)} \\right) x_k^{(j)} $$\n",
+"$$ \\frac{\\partial J}{\\partial \\theta_k^{(j)}} = \\sum_{i:r(i,j)=1} \\left( \\left(\\theta^{(j)}\\right)^T x^{(i)}- y^{(i,j)} \\right) x_k^{(i)} $$\n",
 "\n",
 "Note that the function returns the gradient for both sets of variables by unrolling them into a single vector. After you have completed the code to compute the gradients, the next cell run a gradient check\n",
 "(available in `utils.checkCostFunction`) to numerically check the implementation of your gradients (this is similar to the numerical check that you used in the neural networks exercise. If your implementation is correct, you should find that the analytical and numerical gradients match up closely.\n",
@@ -809,7 +809,7 @@
 "\n",
 "$$ \\frac{\\partial J}{\\partial x_k^{(i)}} = \\sum_{j:r(i,j)=1} \\left( \\left(\\theta^{(j)}\\right)^T x^{(i)} - y^{(i,j)} \\right) \\theta_k^{(j)} + \\lambda x_k^{(i)} $$\n",
 "\n",
-"$$ \\frac{\\partial J}{\\partial \\theta_k^{(j)}} = \\sum_{i:r(i,j)=1} \\left( \\left(\\theta^{(j)}\\right)^T x^{(i)}- y^{(i,j)} \\right) x_k^{(j)} + \\lambda \\theta_k^{(j)} $$\n",
+"$$ \\frac{\\partial J}{\\partial \\theta_k^{(j)}} = \\sum_{i:r(i,j)=1} \\left( \\left(\\theta^{(j)}\\right)^T x^{(i)}- y^{(i,j)} \\right) x_k^{(i)} + \\lambda \\theta_k^{(j)} $$\n",
 "\n",
 "This means that you just need to add $\\lambda x^{(i)}$ to the `X_grad[i,:]` variable described earlier, and add $\\lambda \\theta^{(j)}$ to the `Theta_grad[j, :]` variable described earlier.\n",
 "\n",

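The first Exercise8 hunk matters because `R` holds 0/1 rated flags: `Y[0, R[0, :]]` treats those flags as *integer indices* (repeatedly selecting columns 0 and 1), while `Y[0, R[0, :] == 1]` builds a boolean mask selecting exactly the rated entries. A minimal sketch with a toy `Y`/`R` (not the MovieLens data):

```python
import numpy as np

# Toy ratings: 1 movie x 4 users; a 0 in R means "not rated".
Y = np.array([[5.0, 0.0, 3.0, 0.0]])
R = np.array([[1,   0,   1,   0]])

# Integer (fancy) indexing: R's 0/1 values act as column indices,
# picking columns 1, 0, 1, 0 -- not the rated entries at all.
wrong = Y[0, R[0, :]]        # array([0., 5., 0., 5.])

# Boolean indexing: the comparison builds a mask keeping rated entries only.
right = Y[0, R[0, :] == 1]   # array([5., 3.])

print(np.mean(wrong), np.mean(right))  # 2.5 4.0
```

Only the boolean-mask version computes the average over movies that were actually rated, which is what the printed statistic claims to show.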
README.md

+9 -5

@@ -3,17 +3,21 @@

 ![](machinelearning.jpg)

-This repositry contains the python versions of the programming assignments for the [Machine Learning online class](https://www.coursera.org/learn/machine-learning) taught by Professor Andrew Ng. This is perhaps the most popular introductory online machine learning class. In addition to being popular, it is also one of the best Machine learning classes any interested student can take to get started with machine learning. An unfortunate aspect of this class is that the programming assignments are in MATLAB or OCTAVE, probably because this class was made before python become the go-to language in machine learning.
+This repositry contains the python versions of the programming assignments for the [Machine Learning online class](https://www.coursera.org/learn/machine-learning) taught by Professor Andrew Ng. This is perhaps the most popular introductory online machine learning class. In addition to being popular, it is also one of the best Machine learning classes any interested student can take to get started with machine learning. An unfortunate aspect of this class is that the programming assignments are in MATLAB or OCTAVE, probably because this class was made before python became the go-to language in machine learning.

-The Python machine learning ecosystem has grown exponentially in the past few years, and still gaining momentum. I suspect that many students who want to get started with their machine learning journey would like to start it with Python also. It is for those reasons I have decided to re-write all the programming assignments in Python, so students can get acquainted with its ecosystem from the start of their learning journey.
+The Python machine learning ecosystem has grown exponentially in the past few years, and is still gaining momentum. I suspect that many students who want to get started with their machine learning journey would like to start it with Python also. It is for those reasons I have decided to re-write all the programming assignments in Python, so students can get acquainted with its ecosystem from the start of their learning journey.

 These assignments work seamlessly with the class and do not require any of the materials published in the MATLAB assignments. Here are some new and useful features for these sets of assignments:

 - The assignments use [Jupyter Notebook](http://jupyter-notebook-beginner-guide.readthedocs.io/en/latest/what_is_jupyter.html), which provides an intuitive flow easier than the original MATLAB/OCTAVE assignments.
 - The original assignment instructions have been completely re-written and the parts which used to reference MATLAB/OCTAVE functionality have been changed to reference its `python` counterpart.
 - The re-written instructions are now embedded within the Jupyter Notebook along with the `python` starter code. For each assignment, all work is done solely within the notebook.
 - The `python` assignments can be submitted for grading. They were tested to work perfectly well with the original Coursera grader that is currently used to grade the MATLAB/OCTAVE versions of the assignments.
-- After each part of a given assignment, the Jupyter Notebook contains a cell which prompts the user for submitting the current part of the assignment for grading.
+- After each part of a given assignment, the Jupyter Notebook contains a cell which prompts the user for submitting the current part of the assignment for grading.
+
+## Online workspace
+
+You can work on the assignments in an online workspace called [Deepnote](https://www.deepnote.com/). This allows you to play around with the code and access the assignments from your browser. [<img height="22" src="https://beta.deepnote.com/buttons/launch-in-deepnote.svg">](https://beta.deepnote.com/launch?template=data-science&url=https%3A%2F%2Fgithub.com%2Fdibgerge%2Fml-coursera-python-assignments)

 ## Downloading the Assignments

@@ -24,7 +28,7 @@ To get started, you can start by either downloading a zip file of these assignme
 Each assignment is contained in a separate folder. For example, assignment 1 is contained within the folder `Exercise1`. Each folder contains two files:
 - The assignment `jupyter` notebook, which has a `.ipynb` extension. All the code which you need to write will be written within this notebook.
 - A python module `utils.py` which contains some helper functions needed for the assignment. Functions within the `utils` module are called from the python notebook. You do not need to modify or add any code to this file.
-
+
 ## Requirements

 These assignments has been tested and developed using the following libraries:
@@ -96,4 +100,4 @@ If you are new to python and to `jupyter` notebooks, no worries! There is a plet

 - I would like to thank professor Andrew Ng and the crew of the Stanford Machine Learning class on Coursera for such an awesome class.

-- Some of the material used, especially the code for submitting assignments for grading is based on [`mstampfer`'s](https://github.com/mstampfer/Coursera-Stanford-ML-Python) python implementation of the assignments.
+- Some of the material used, especially the code for submitting assignments for grading is based on [`mstampfer`'s](https://github.com/mstampfer/Coursera-Stanford-ML-Python) python implementation of the assignments.
