
Lecture 4 Part 2: Nonlinear Root Finding, Optimization, and Adjoint Gradient Methods

Oct 23, 2023
44:26

MIT 18.S096 Matrix Calculus for Machine Learning and Beyond, IAP 2023
Instructors: Alan Edelman, Steven G. Johnson
View the complete course: https://ocw.mit.edu/courses/18-s096-matrix-calculus-for-machine-learning-and-beyond-january-iap-2023/
YouTube Playlist: https://www.youtube.com/playlist?list=PLUl4u3cNGP62EaLLH92E_VCN4izBKK6OE

Description: Nonlinear root finding by Newton's method and optimization by gradient descent. "Adjoint" methods (reverse-mode/backpropagation) let us find gradients efficiently for large-scale engineering optimization.

License: Creative Commons BY-NC-SA
More information at https://ocw.mit.edu/terms
More courses at https://ocw.mit.edu
Support OCW at http://ow.ly/a1If50zVRlQ

We encourage constructive comments and discussion on OCW's YouTube and other social media channels. Personal attacks, hate speech, trolling, and inappropriate comments are not allowed and may be removed. More details at https://ocw.mit.edu/comments.
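As a rough illustration of the three techniques named in the description (this sketch is not taken from the lecture; the function names and toy problems are illustrative assumptions), here is a minimal Python rendering of Newton's method for a nonlinear system, a plain gradient-descent loop, and the adjoint trick for differentiating through a linear solve:

    # Minimal sketch, assuming NumPy; toy problems and names are illustrative,
    # not the lecture's own examples.
    import numpy as np

    def newton(f, jac, x0, tol=1e-10, maxiter=50):
        """Solve f(x) = 0 by Newton's method: x <- x - J(x)^{-1} f(x)."""
        x = np.asarray(x0, dtype=float)
        for _ in range(maxiter):
            fx = f(x)
            if np.linalg.norm(fx) < tol:
                break
            x = x - np.linalg.solve(jac(x), fx)  # one Newton step per iteration
        return x

    def gradient_descent(grad, p0, lr=0.1, steps=200):
        """Minimize a scalar function by stepping against its gradient."""
        p = np.asarray(p0, dtype=float)
        for _ in range(steps):
            p = p - lr * grad(p)
        return p

    def adjoint_gradient(A, dA_list, b, c):
        """Gradient of g(p) = c^T x(p), where x solves A(p) x = b.

        Differentiating A x = b gives dx/dp_k = -A^{-1} (dA/dp_k) x, so
        dg/dp_k = -lambda^T (dA/dp_k) x with the adjoint solve A^T lambda = c.
        One adjoint solve yields all parameter derivatives at once, which is
        the reverse-mode/backpropagation advantage for many parameters.
        """
        x = np.linalg.solve(A, b)        # forward solve
        lam = np.linalg.solve(A.T, c)    # single adjoint solve
        return np.array([-(lam @ (dAk @ x)) for dAk in dA_list])

    # Toy usage: intersect the circle x^2 + y^2 = 4 with the line y = x.
    f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]])
    jac = lambda v: np.array([[2*v[0], 2*v[1]], [-1.0, 1.0]])
    print(newton(f, jac, x0=[1.0, 2.0]))  # approx [sqrt(2), sqrt(2)]

The point of the adjoint version is the cost model: a naive forward-mode gradient needs one extra linear solve per parameter, while the adjoint formulation needs just one extra solve regardless of how many parameters enter A(p).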

Download

1 format

Video Formats

360p MP4 (89.0 MB)

Right-click 'Download' and select 'Save Link As' if the file opens in a new tab.
