Welcome to video #3 in the Adaptive Experimentation series, presented by graduate student Sterling Baird (@sterling-baird) at the 18th IEEE Conference on eScience in Salt Lake City, UT (Oct 10-14, 2022). In this video, Sterling summarizes the 10 upcoming tutorials and discusses the differences among Meta's Ax Loop, Service, and Developer APIs. He demonstrates how to use the Ax Loop API to perform Bayesian optimization in a closed-loop fashion on an inexpensive function. In the next installment of this series, we will explore machine learning model tuning with constraints.
GitHub link to the Jupyter notebook: https://github.com/sparks-baird/self-driving-lab-demo/blob/main/notebooks/escience/2.1-gpei_hartmann_loop.ipynb
previous video in series: https://youtu.be/41fQs4JxRQA
next video in series: https://youtu.be/9_gVsdexoJA
0:00 intro, summary of previous videos, and outline of upcoming tutorials
6:58 closed-loop optimization of an inexpensive function (Loop vs. Service vs. Developer API)
9:10 evaluation function
10:45 running optimization
12:35 viewing and assessing optimization results