[newton_method] Update lecture code and fix typos #564
base: main
Conversation
Many thanks, @kp992! Great changes!
In this lecture, we used `autograd` because we were trying to avoid using JAX in the intermediate series. Since we are now migrating to JAX, I think we can use `jax.jacobian` here by adapting some functions from this lecture:
https://jax.quantecon.org/newtons_method.html
Let me know if that sounds good to you!
(CC @mmcky)
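For reference, here is a minimal sketch of the kind of `jax.jacobian`-based Newton routine being discussed; the function and variable names are illustrative and are not taken from the linked lecture:

```python
import jax
import jax.numpy as jnp

def newton(f, x_0, tol=1e-5, max_iter=50):
    """Newton iteration x_{k+1} = x_k - J(x_k)^{-1} f(x_k) for f: R^n -> R^n."""
    f_jac = jax.jacobian(f)   # Jacobian of f via automatic differentiation
    x = x_0
    for _ in range(max_iter):
        # Solve J(x) d = f(x) rather than forming the inverse explicitly
        x_new = x - jnp.linalg.solve(f_jac(x), f(x))
        if jnp.max(jnp.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x
```

For example, `newton(lambda x: x**3 - jnp.array([2.0]), jnp.array([1.0]))` should converge to the cube root of 2 in a handful of iterations.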
Thanks @HumphreyYang for the review. I was wondering if we want to keep just a single lecture on Newton's method using JAX, or if we are planning to keep two versions -- one here and the other in the JAX series?
Hi @kp992,
Yes! Our plan is to replace them with the JAX implementation, keep the old JAX series, and rename it to 'GPU computing for computational economics'.
Thanks @HumphreyYang for the clarification. I will update the lecture to use JAX then.
So basically, we should host the same lecture in this series too, right? I mean, we can copy over the JAX lecture here and update the code with styling fixes and typo corrections?
Hi @kp992, yes, but I recall that for some lectures, the JAX version we wrote is a simplified version without discussions. For code that has overlaps, I think we can copy things over (like the …
The lecture is now updated to the optimized JAX version, and NumPy/autograd support has been removed.
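(For context, a rough sketch of what such an optimized JAX solver might look like, assuming "optimized" refers to compiling the whole iteration with `jax.jit` and `jax.lax.while_loop`; the actual lecture code may differ.)

```python
from functools import partial

import jax
import jax.numpy as jnp

@partial(jax.jit, static_argnums=(0,))
def newton_jit(f, x_0, tol=1e-5, max_iter=50):
    """Newton solver with the loop written as jax.lax.while_loop,
    so the entire solve compiles to a single XLA computation."""
    f_jac = jax.jacobian(f)

    def step(x):
        # One Newton update: x - J(x)^{-1} f(x)
        return x - jnp.linalg.solve(f_jac(x), f(x))

    def cond(carry):
        _, error, n = carry
        return (error > tol) & (n < max_iter)

    def body(carry):
        x, _, n = carry
        x_new = step(x)
        return x_new, jnp.max(jnp.abs(x_new - x)), n + 1

    init = (x_0, jnp.asarray(jnp.inf, dtype=x_0.dtype), 0)
    x_star, _, _ = jax.lax.while_loop(cond, body, init)
    return x_star
```

It is called the same way as a plain Python-loop version, e.g. `newton_jit(lambda x: x**3 - jnp.array([2.0]), jnp.array([1.0]))`.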
Pull Request Overview
This PR modernizes the Newton's method lecture by updating the code to follow current best practices, fixing typos, and correcting mathematical notation errors. The changes improve code quality and educational clarity while maintaining the lecture's pedagogical structure.
Key changes:
- Migrates from deprecated `autograd` to `jax` for automatic differentiation
- Updates from `collections.namedtuple` to `typing.NamedTuple` for better type support (see the sketch after this list)
- Corrects mathematical notation and fixes various typos throughout
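As an illustration of the `collections.namedtuple` to `typing.NamedTuple` change, here is a hypothetical parameter container; the field names and default values are placeholders and not necessarily those used in the lecture:

```python
from typing import NamedTuple

# Old style:  Model = collections.namedtuple("Model", ["A", "s", "alpha", "delta"])
# New style:  a NamedTuple subclass with type annotations and default values
class Model(NamedTuple):
    A: float = 2.0        # hypothetical productivity parameter
    s: float = 0.3        # hypothetical savings rate
    alpha: float = 0.3    # hypothetical output elasticity of capital
    delta: float = 0.4    # hypothetical depreciation rate

m = Model()               # instantiated with defaults; still unpacks like a tuple
A, s, alpha, delta = m
```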
lectures/newton_method.md
Outdated
```{code-cell} ipython3
k_star_approx_newton
```

- The result confirms the descent we saw in the graphs above: a very accurate result is reached with only 5 iterations.
+ The result confirms the convergence we saw in the graphs above: a very accurate result is reached with only 5 iterations.
Suggested change:

- The result confirms the convergence we saw in the graphs above: a very accurate result is reached with only 5 iterations.
+ The result confirms convergence we saw in the graphs above: a very accurate result is reached with only 5 iterations.
nice work @kp992 -- I have left some minor comments for you. Mainly just minor adjustments for pep8 etc.
Thanks for the review @mmcky.
This PR:
- migrates from `collections.namedtuple` to `typing.NamedTuple`
- migrates from `autograd` to `jax` for automatic differentiation