From the course: Deep Learning with Python: Hands-On Introduction to Deep Learning Models


Gradient descent


- [Instructor] Gradient descent is a fundamental optimization technique used extensively in deep learning and many other machine learning algorithms. It aims to find the optimal parameters that minimize a given loss function. To grasp gradient descent, it's important to understand the concept of a loss function. Imagine you're training a model to predict something like house prices. After training on some data, the model makes predictions that might not perfectly match the actual prices. The loss function is a mathematical formula that measures how bad those predictions are compared to the real data. Essentially, it quantifies the difference between the predicted values and the true values. The goal when training a neural network is to continually adjust the network parameters to make the loss as small as possible, meaning the predictions are as close as possible to the actual values. Now let's imagine that for all possible values of a particular parameter theta, we map the value of…
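The idea described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the course: the quadratic loss, the target value 3.0, the learning rate, and the step count are all illustrative choices. Gradient descent repeatedly nudges the parameter theta in the direction opposite the gradient of the loss.

```python
# Illustrative example (not from the course): minimize the loss
# L(theta) = (theta - 3)^2, whose gradient is dL/dtheta = 2 * (theta - 3).
# The minimum sits at theta = 3, so gradient descent should converge there.

def loss(theta):
    return (theta - 3.0) ** 2

def gradient(theta):
    return 2.0 * (theta - 3.0)

theta = 0.0          # initial parameter guess (chosen arbitrarily)
learning_rate = 0.1  # step size; too large diverges, too small is slow

for step in range(100):
    # Update rule: move theta against the gradient to reduce the loss.
    theta = theta - learning_rate * gradient(theta)

print(round(theta, 4))  # converges toward 3.0, the minimizer of the loss
```

In a real neural network, theta is a vector of weights and the gradient is computed by backpropagation, but the update rule is the same: subtract the learning rate times the gradient at every step.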