From the course: Full-Stack Deep Learning with Python (2024)

Hyperparameter optimization with Hyperopt and MLflow
- [Instructor] Now that we've set up the objective function, which returns the metric to be minimized during hyperparameter optimization, we can start the hyperparameter tuning process. Notice that we perform hyperparameter tuning within an outer MLflow run, which I create with mlflow.start_run. All of the runs used to track the individual model trainings' metrics and parameters will be nested within this outer run. The way you perform hyperparameter tuning using Hyperopt is with the fmin function. The fmin function executes the actual hyperparameter optimization process, searching for the set of hyperparameters that minimizes our objective function. The function that fmin will run over and over again to train the different models, which are part of this optimization process, is the function that we pass in as an input argument on line three, the train_emnist function.…
