Week 3 Quiz: Convolutional Neural Networks in TensorFlow (DeepLearning.AI TensorFlow Developer Professional Certificate) Answers 2025
1. Question 1
If I put a dropout parameter of 0.2, how many nodes will I lose?
- ✅ 20% of them
- ❌ 2% of them
- ❌ 20% of the untrained ones
- ❌ 2% of the untrained ones
Explanation: `Dropout(0.2)` randomly disables 20% of a layer's neurons on each training step; at inference time, all neurons are active.
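For context, here is a minimal sketch of where `Dropout(0.2)` sits in a model; the input shape and layer sizes are illustrative, not taken from the quiz:

```python
import tensorflow as tf

# Dropout(0.2) zeroes a random 20% of the previous layer's activations
# on each training step; it is automatically a no-op at inference time.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(150, 150, 3)),  # illustrative shape
    tf.keras.layers.Dense(512, activation='relu'),
    tf.keras.layers.Dropout(0.2),  # 0.2 = drop 20% of nodes per step
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
```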
2. Question 2
How do you change the number of classes when using transfer learning?
- ❌ Ignore all classes above yours
- ❌ Use all classes but set weights to 0
- ✅ When you add your DNN at the bottom, specify your output layer with the number of classes you want
- ❌ Use dropouts to eliminate classes
Explanation: In transfer learning, the pre-trained base is kept as-is; you replace the final Dense layer with your own output layer sized to your class count (e.g., `Dense(2)` for 2 classes).
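A hedged sketch of this pattern, using InceptionV3 as an example base (the input shape and Dense sizes are illustrative):

```python
import tensorflow as tf

# Load a pre-trained base without its original 1000-class head.
base = tf.keras.applications.InceptionV3(
    input_shape=(150, 150, 3),   # illustrative input size
    include_top=False,           # drop the original output layer
    weights='imagenet')
base.trainable = False           # keep pre-trained features frozen

# Add your own classifier head with the class count you need.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1024, activation='relu'),
    tf.keras.layers.Dense(2, activation='softmax'),  # 2 = your class count
])
```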
3. Question 3
Which is the correct line of code for declaring dropout of 20%?
- ❌ `tf.keras.layers.Dropout(20)`
- ❌ `tf.keras.layers.DropoutNeurons(20)`
- ✅ `tf.keras.layers.Dropout(0.2)`
- ❌ `tf.keras.layers.DropoutNeurons(0.2)`
Explanation: Dropout takes a fraction (0–1), not a percentage or neuron count.
4. Question 4
Why do dropouts help avoid overfitting?
- ✅ Because neighbor neurons can have similar weights, and thus can skew the final training
- ❌ Having fewer neurons speeds up training
Explanation: Dropout breaks co-adaptation between neurons: since no neuron can rely on a specific neighbor being present, each learns more robust, generalizable features.
5. Question 5
Why is transfer learning useful?
- ❌ Use all original training data
- ❌ Use all original validation data
- ✅ Use features learned from large datasets you may not have access to
- ❌ Use validation metadata from large datasets
Explanation: Transfer learning imports pre-learned features from huge datasets (like ImageNet) so you benefit without training from scratch.
6. Question 6
Can you use image augmentation with transfer learning?
- ❌ No, because features are pre-set
- ✅ Yes, you can augment when training the layers you added
Explanation: Augmentation transforms the input images, so it still benefits the new classifier layers you train on top of the frozen base.
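As a sketch, the usual pattern is to augment the input pipeline while the base stays frozen; the directory path and parameter values below are placeholders:

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Augmentation happens on the input images, so it benefits whichever
# layers are actually training (your new head), even with a frozen base.
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True)

train_generator = train_datagen.flow_from_directory(
    'path/to/train',          # placeholder directory
    target_size=(150, 150),
    batch_size=20,
    class_mode='binary')
# model.fit(train_generator, epochs=...) then trains only the unfrozen layers.
```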
7. Question 7
How do you freeze a layer from retraining?
- ❌ `tf.freeze(layer)`
- ❌ `tf.layer.frozen = True`
- ❌ `tf.layer.locked = True`
- ✅ `layer.trainable = False`
Explanation: Setting `layer.trainable = False` tells TensorFlow not to update that layer's weights during training.
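A minimal sketch of freezing in practice (the choice of base model is illustrative):

```python
import tensorflow as tf

base = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')

# Freeze each layer so its weights are not updated during training.
for layer in base.layers:
    layer.trainable = False

# Equivalently, freeze the entire base in one line:
base.trainable = False
```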
8. Question 8
What happens if dropout rate is too high?
- ✅ The network becomes ineffective at learning
- ❌ Training time increases due to extra calculations
Explanation: Too high a dropout rate disables too many neurons on each step, leaving the network too little capacity to learn, so accuracy stays low.
🧾 Summary Table
| Q# | Correct Answer | Key Concept |
|---|---|---|
| 1 | 20% of neurons | Dropout rate interpretation |
| 2 | Replace final output layer | Transfer learning output classes |
| 3 | Dropout(0.2) | Correct TF syntax |
| 4 | Prevents co-adaptation | Dropout avoids overfitting |
| 5 | Uses pre-learned features | Benefit of transfer learning |
| 6 | Yes, augmentation allowed | Works with custom layers |
| 7 | layer.trainable = False | Freezing layers |
| 8 | Network becomes ineffective | Dropout too high |