
Random Search for Hyper-parameter Optimization

10.4K views
Jul 26, 2019
11:40

In this video, I'll show you how random search performs about as well as grid search with far fewer iterations. Random search is a technique where random combinations of the hyperparameters are used to find the best solution for the model. It is similar to grid search, yet it has proven to yield comparable or better results at a fraction of the cost. Instead of searching over the entire grid, random search evaluates only a random sample of points on the grid, which makes it much cheaper than grid search. Random search wasn't taken very seriously at first: since it doesn't visit every grid point, it cannot beat the optimum found by an exhaustive grid search. But then along came Bergstra and Bengio, who showed that in surprisingly many cases random search performs about as well as grid search. All in all, trying 60 random points sampled from the grid seems to be good enough (with 60 random draws, the probability that at least one lands in the top 5% of the grid is 1 − 0.95^60 ≈ 95%).

Link to the Notebook - https://github.com/bhattbhavesh91/decision_tree_grid_search/blob/master/RandomSearch_Implementation.ipynb

If you have any questions about what we covered in this video, feel free to ask in the comment section below and I'll do my best to answer them. If you enjoy these tutorials and would like to support them, the easiest way is simply to like the video and give it a thumbs up; it's also a huge help to share these videos with anyone you think would find them useful. Please consider clicking the SUBSCRIBE button to be notified of future videos, and thank you all for watching.

You can find me on:
GitHub - https://github.com/bhattbhavesh91
Medium - https://medium.com/@bhattbhavesh91

Reference to the original article: https://www.oreilly.com/ideas/evaluating-machine-learning-models/page/5/hyperparameter-tuning

#RandomSearch #GridSearch #HyperparameterTuning
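The idea above can be sketched with scikit-learn's `RandomizedSearchCV`, sampling 60 random points as suggested. This is a minimal illustration, not the notebook's exact code: the decision-tree parameter ranges and the iris dataset here are assumptions chosen just to make the example self-contained.

```python
# Sketch: random search over decision-tree hyperparameters.
# The parameter ranges and dataset are illustrative assumptions.
from scipy.stats import randint
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Distributions to sample from, instead of an exhaustive grid.
param_dist = {
    "max_depth": randint(1, 10),
    "min_samples_leaf": randint(1, 10),
    "criterion": ["gini", "entropy"],
}

# n_iter=60 random points, per the Bergstra & Bengio rule of thumb above.
search = RandomizedSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_distributions=param_dist,
    n_iter=60,
    cv=5,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Swapping `RandomizedSearchCV` for `GridSearchCV` with the same budget would require enumerating every grid point; here only 60 of them are evaluated.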

