From the course: Deep Learning: Image Recognition
Dropout - Python Tutorial
- We have one more trick up our sleeves to make our neural network perform better: dropout layers. Open up 07_dropout.py. One problem with neural networks is that they tend to memorize the input data instead of actually learning how to tell different objects apart. We can force the network to work harder to learn without memorizing by randomly throwing away some of the data between certain layers, cutting some of the connections between them. This is called dropout. Usually we add dropout right after max pooling layers, or after a group of dense layers. Let's go down to line 26, right after this max pooling layer. To add a new layer, we call model.add and create a new Dropout layer. The only parameter we need to pass in is the percentage of neural network connections to randomly cut. Usually a value between 25% and 50% works well. We'll use 25%, so we pass in 0.25. Now let's add…
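In Keras this is the `model.add(Dropout(0.25))` call the transcript describes. To see the mechanism itself, here is a minimal NumPy sketch of "inverted" dropout (the variant Keras uses at training time): a fraction of the activations is zeroed at random, and the survivors are scaled up so the expected magnitude of the layer's output is unchanged. The function name and the fixed seed are illustrative, not from the course files.

```python
import numpy as np

def dropout(activations, rate=0.25, rng=None):
    """Inverted dropout: randomly zero `rate` of the values,
    then scale the survivors by 1 / (1 - rate) so the expected
    output magnitude matches the no-dropout case."""
    rng = rng or np.random.default_rng(0)  # fixed seed for a repeatable demo
    mask = rng.random(activations.shape) >= rate  # True = connection kept
    return activations * mask / (1.0 - rate)

x = np.ones((4, 4))        # stand-in for a layer's activations
y = dropout(x, rate=0.25)  # ~25% of entries become 0, the rest 1/0.75
```

At inference time no connections are cut; because of the `1 / (1 - rate)` scaling during training, nothing needs to be rescaled when the network is used for predictions.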
Contents
- Designing a neural network architecture for image recognition (4m 7s)
- Exploring the CIFAR-10 data set (2m 50s)
- Loading an image data set (4m 6s)
- Dense layers (3m 27s)
- Convolution layers (5m 15s)
- Max pooling (1m 40s)
- Dropout (1m 54s)
- A complete neural network for image recognition (2m 30s)