From the course: Neural Networks and Convolutional Neural Networks Essential Training

Backpropagation and hyperparameters

We've built our neural network architecture, we've chosen our activation functions, and we're ready to tackle complex problems. But there's still one massive challenge. How do you teach a neural network with millions of connections to get better? When our network makes a mistake recognizing a handwritten six, which of the millions of weights should we adjust, and by how much? It's like having a sports team where everyone played poorly, and you need to give each player specific feedback to improve. In 1985, three researchers, Rumelhart, Hinton, and Williams, solved this puzzle with an algorithm that changed everything. They called it backpropagation. Now, the name sounds intimidating, but the concept is beautifully simple. It's like having the world's most efficient coaching system that can simultaneously give personalized feedback to millions of players. So here's how it works. First, our neural network makes a prediction. So let's say it looks at the handwritten six…
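To make the idea concrete, here is a minimal sketch of that "personalized feedback" loop: a tiny one-hidden-layer network makes predictions, the error is propagated backward through each layer, and every individual weight receives its own gradient-based adjustment. All the specifics here (layer sizes, the learning rate, the toy data) are illustrative assumptions, not details from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples with 3 features each, and binary targets.
# (Illustrative values only, not from the course.)
X = rng.normal(size=(4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# Weights for a one-hidden-layer network: 3 inputs -> 5 hidden -> 1 output.
W1 = rng.normal(scale=0.5, size=(3, 5))
W2 = rng.normal(scale=0.5, size=(5, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate (a hyperparameter we choose, not learn)
losses = []
for step in range(200):
    # Forward pass: the network makes a prediction.
    h = sigmoid(X @ W1)           # hidden-layer activations
    p = sigmoid(h @ W2)           # output prediction
    loss = np.mean((p - y) ** 2)  # mean squared error
    losses.append(loss)

    # Backward pass: propagate the error from the output back toward
    # the input, computing a gradient for every single weight.
    dp = 2 * (p - y) / len(y)     # dLoss/dPrediction
    dz2 = dp * p * (1 - p)        # through the output sigmoid
    dW2 = h.T @ dz2               # gradient for each weight in W2
    dh = dz2 @ W2.T               # error sent back to the hidden layer
    dz1 = dh * h * (1 - h)        # through the hidden sigmoid
    dW1 = X.T @ dz1               # gradient for each weight in W1

    # Update: every weight gets its own specific feedback.
    W2 -= lr * dW2
    W1 -= lr * dW1

print(f"loss before: {losses[0]:.4f}, loss after: {losses[-1]:.4f}")
```

Running the loop, the loss shrinks over the 200 steps, which is exactly the point of backpropagation: rather than guessing which of the millions of weights to blame, the chain rule assigns each one its precise share of the error.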