
06.1. Theory: Numerical Optimization in Machine Learning

52 views
Jul 16, 2025
1:32:45

Hey everyone! In today’s video, we’re gonna dive into some of the coolest optimization tricks in Machine Learning 🚀. We’ll cover:
✅ 0:00:00 - Review of the previous lecture
✅ 0:48:00 - Gradient Descent (the workhorse of ML!)
✅ 1:13:00 - (Quasi-)Newton’s Method
✅ 1:21:00 - Batch vs. Online learning
✅ 1:28:00 - Overview of big-data optimization methods: Momentum, RMSProp, and Adam
These are the tools that make training large models possible. ❗This video is part of an open-science lecture series, so feel free to learn and share, but please respect the copyright. Let’s jump in and have some fun! 🙌
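To give a taste of two of the methods covered, here is a minimal sketch (not taken from the lecture) of plain gradient descent and the Adam update on a toy quadratic loss; the loss, learning rates, and step counts are illustrative choices, not the lecture's examples.

```python
import numpy as np

# Toy loss: f(w) = ||w - target||^2, whose gradient is 2 * (w - target).
def grad(w, target):
    return 2.0 * (w - target)

def gradient_descent(w, target, lr=0.1, steps=100):
    """Plain gradient descent: step against the gradient at a fixed rate."""
    for _ in range(steps):
        w = w - lr * grad(w, target)
    return w

def adam(w, target, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=200):
    """Adam: per-coordinate step sizes from running moment estimates."""
    m = np.zeros_like(w)  # first-moment (mean of gradients) estimate
    v = np.zeros_like(w)  # second-moment (mean of squared gradients) estimate
    for t in range(1, steps + 1):
        g = grad(w, target)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)  # bias correction for zero-initialized m
        v_hat = v / (1 - beta2 ** t)  # bias correction for zero-initialized v
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

target = np.array([3.0, -1.0])
print(gradient_descent(np.zeros(2), target))  # approaches target
print(adam(np.zeros(2), target))              # approaches target
```

On this well-conditioned quadratic, plain gradient descent converges cleanly; Adam's per-coordinate scaling pays off mainly on the noisy, badly scaled gradients typical of large-model training, which is why the lecture groups it with the big-data methods.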

