
Commit 4960229

Authored Jun 1, 2021
Merge pull request #3 from agustinaliagac/fix-typos
Fix formula typo in exercise 3 and other typos in exercise 5
2 parents 7f2e471 + c9e00ff commit 4960229

2 files changed, +3 -3 lines changed

 

Exercise3/exercise3.ipynb (+1 -1)
@@ -373,7 +373,7 @@
 "$$\n",
 "\\begin{align*}\n",
 "& \\frac{\\partial J(\\theta)}{\\partial \\theta_0} = \\frac{1}{m} \\sum_{i=1}^m \\left( h_\\theta\\left( x^{(i)} \\right) - y^{(i)} \\right) x_j^{(i)} & \\text{for } j = 0 \\\\\n",
-"& \\frac{\\partial J(\\theta)}{\\partial \\theta_0} = \\left( \\frac{1}{m} \\sum_{i=1}^m \\left( h_\\theta\\left( x^{(i)} \\right) - y^{(i)} \\right) x_j^{(i)} \\right) + \\frac{\\lambda}{m} \\theta_j & \\text{for } j \\ge 1\n",
+"& \\frac{\\partial J(\\theta)}{\\partial \\theta_j} = \\left( \\frac{1}{m} \\sum_{i=1}^m \\left( h_\\theta\\left( x^{(i)} \\right) - y^{(i)} \\right) x_j^{(i)} \\right) + \\frac{\\lambda}{m} \\theta_j & \\text{for } j \\ge 1\n",
 "\\end{align*}\n",
 "$$\n",
 "\n",

Exercise5/exercise5.ipynb (+2 -2)
@@ -687,9 +687,9 @@
 "\n",
 "### 3.2 Optional (ungraded) exercise: Adjusting the regularization parameter\n",
 "\n",
-"In this section, you will get to observe how the regularization parameter affects the bias-variance of regularized polynomial regression. You should now modify the the lambda parameter and try $\\lambda = 1, 100$. For each of these values, the script should generate a polynomial fit to the data and also a learning curve.\n",
+"In this section, you will get to observe how the regularization parameter affects the bias-variance of regularized polynomial regression. You should now modify the lambda parameter and try $\\lambda = 1, 100$. For each of these values, the script should generate a polynomial fit to the data and also a learning curve.\n",
 "\n",
-"For $\\lambda = 1$, the generated plots should look like the the figure below. You should see a polynomial fit that follows the data trend well (left) and a learning curve (right) showing that both the cross validation and training error converge to a relatively low value. This shows the $\\lambda = 1$ regularized polynomial regression model does not have the high-bias or high-variance problems. In effect, it achieves a good trade-off between bias and variance.\n",
+"For $\\lambda = 1$, the generated plots should look like the figure below. You should see a polynomial fit that follows the data trend well (left) and a learning curve (right) showing that both the cross validation and training error converge to a relatively low value. This shows the $\\lambda = 1$ regularized polynomial regression model does not have the high-bias or high-variance problems. In effect, it achieves a good trade-off between bias and variance.\n",
 "\n",
 "<table>\n",
 "  <tr>\n",
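
The edited paragraph asks you to rerun the fit for $\lambda = 1$ and $\lambda = 100$ and to plot a learning curve for each. A minimal NumPy sketch of that loop, assuming `X_poly`, `y`, `X_poly_val`, `yval` are the bias-prefixed polynomial features and targets from the exercise (illustrative names, and a regularized normal equation stands in for the notebook's own optimizer):

```python
import numpy as np

def train_reg_linear(X, y, lam):
    """Fit theta with a regularized normal equation; the bias term is not penalized."""
    L = lam * np.eye(X.shape[1])
    L[0, 0] = 0.0
    return np.linalg.solve(X.T @ X + L, X.T @ y)

def learning_curve(X, y, Xval, yval, lam):
    """Unregularized train / cross-validation error for training-set sizes 1..m."""
    m = X.shape[0]
    err_train, err_val = np.zeros(m), np.zeros(m)
    for i in range(1, m + 1):
        theta = train_reg_linear(X[:i], y[:i], lam)
        err_train[i - 1] = np.mean((X[:i] @ theta - y[:i]) ** 2) / 2
        err_val[i - 1] = np.mean((Xval @ theta - yval) ** 2) / 2
    return err_train, err_val

# for lam in (1.0, 100.0):
#     err_train, err_val = learning_curve(X_poly, y, X_poly_val, yval, lam)
#     # plot err_train and err_val against range(1, m + 1)
```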
