From the course: Full-Stack Deep Learning with Python (2024)
Hyperparameter optimization with Hyperopt and MLflow
- [Instructor] Now that we've set up the objective function, which returns the metric that has to be minimized during hyperparameter optimization, we can start the hyperparameter tuning process. Notice that we perform hyperparameter tuning within an outer MLflow run, which I start with mlflow.start_run. All of the runs used to track the individual model trainings' metrics and parameters will be nested within this outer run. Now, the way you perform hyperparameter tuning using Hyperopt is by using the fmin function. The fmin function is responsible for executing the actual hyperparameter optimization process and searching for the best set of hyperparameters that minimize our objective function. The function that fmin will run over and over again to train the different models that are part of this optimization process is the function that we pass in as an input argument on line three, the train_emnist function.…
Contents
- Preparing data for image classification using CNN (4m 2s)
- Configuring and training the model using MLflow runs (6m 19s)
- Visualizing charts, metrics, and parameters on MLflow (6m 52s)
- Setting up the objective function for hyperparameter tuning (5m 35s)
- Hyperparameter optimization with Hyperopt and MLflow (6m 21s)
- Identifying the best model (3m 39s)
- Registering a model with the MLflow registry (3m 12s)