
Gradient descent optimization in C++ - Part 1 - Theory

4.2K views
Sep 7, 2020
23:27

Gradient descent is a powerful numerical optimization technique used in many fields such as #machinelearning and #AI, indeed anywhere we need to find a set of parameters that minimizes the output of an unknown function. In part 1 of this two-part series I focus on the theory behind the gradient descent method and give some examples. In part 2, I show how I have approached implementing a simple gradient descent algorithm in C++. You can see that video here: https://youtu.be/eyCq3cNFpMU

You can also follow me on the QuantitativeBytes Facebook page at: www.facebook.com/QuantitativeBytes

