From the course: TensorFlow: Practical Skills in Constructing, Training, and Optimizing Models
Use Dropout for regularization - TensorFlow Tutorial
- [Instructor] If you have experience building machine learning models, then you have probably worked with regularizers such as Lasso and Ridge. Neural networks allow for a new kind of regularization called dropout, which simply toggles some of the nodes in your network off and back on during training. This prevents certain pathways from becoming overused, which would be a telltale sign of overfitting to your data. In this lesson, I'll walk you through the process of adding dropout to your neural network model. Go ahead and open up 04_04_dropout.ipynb. In TensorFlow, dropout is added as though it were a separate layer, placed immediately after the layer you want it to apply to. The only thing you need to supply is the rate, which is the fraction of nodes to shut off randomly during each training iteration. Here I have imported TensorFlow, along with the Sequential model type and the layers module from the Keras library. I have…
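As a rough sketch of what the transcript describes, here is a minimal Keras `Sequential` model with a `Dropout` layer inserted immediately after a dense layer. The layer sizes, input shape, and rate of 0.3 are illustrative assumptions, not values from the lesson's notebook.

```python
import tensorflow as tf
from tensorflow.keras import Sequential, layers

# Illustrative architecture (sizes and rate are assumptions, not from the lesson):
model = Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(128, activation="relu"),
    # Dropout applies to the layer directly above it: during each training
    # step, it randomly zeroes 30% of that layer's outputs. At inference
    # time it is a no-op (outputs are scaled appropriately during training).
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

Note that `Dropout` adds no trainable parameters; it only modifies activations flowing through it, and only while training.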