
Module-level Graded Quiz: Shallow Neural Networks | Deep Learning with PyTorch (IBM AI Engineering Professional Certificate) Answers 2025

1. Question 1

  • ❌ Multiply weights

  • ❌ Linearly separate data

  • ✅ Introduce non-linearity by mapping inputs to [0, 1] (see the sketch below)

  • ❌ Map to [-1, 1]
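
A minimal PyTorch sketch (not from the course materials; the values are illustrative) of why this is the correct choice: torch.sigmoid squashes any real-valued input into (0, 1), which is what adds non-linearity on top of a neuron's purely linear computation.

```python
import torch

# Arbitrary real-valued pre-activations (illustrative values only)
z = torch.tensor([-5.0, -1.0, 0.0, 1.0, 5.0])

# Sigmoid maps every input into (0, 1), introducing non-linearity
# on top of the linear w*x + b computation of each neuron.
a = torch.sigmoid(z)

print(a)                          # all values strictly between 0 and 1
print(a.min() > 0, a.max() < 1)   # tensor(True) tensor(True)
```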


2. Question 2

  • ❌ One hidden + output only

  • ❌ Two hidden layers

  • ❌ Two input layers

  • ✅ One input layer and one output layer
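
A minimal sketch of that structure (layer sizes are arbitrary, chosen only for illustration): a single nn.Linear connects the input layer directly to the output layer, with no hidden layers in between.

```python
import torch
from torch import nn

# Simplest possible network: inputs feed straight into the output layer.
model = nn.Linear(in_features=4, out_features=1)

x = torch.randn(8, 4)   # batch of 8 samples, 4 input features each
y = model(x)            # the output layer produces one value per sample
print(y.shape)          # torch.Size([8, 1])
```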


3. Question 3

  • ❌ Reduces parameters

  • ❌ Causes underfitting

  • ❌ Decreases flexibility

  • ✅ Increases model flexibility
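
One way to see this in code (a sketch with assumed, arbitrary layer sizes): widening the hidden layer adds parameters, which enlarges the family of functions the network can represent.

```python
from torch import nn

def make_net(hidden_units: int) -> nn.Sequential:
    # One hidden layer; its width controls how flexible the model is.
    return nn.Sequential(
        nn.Linear(2, hidden_units),
        nn.Sigmoid(),
        nn.Linear(hidden_units, 1),
    )

small = make_net(hidden_units=3)
large = make_net(hidden_units=50)

n_params = lambda m: sum(p.numel() for p in m.parameters())
print(n_params(small), n_params(large))  # more neurons -> more parameters -> more flexibility
```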


4. Question 4

  • ❌ Add more layers

  • ❌ Adjust weights

  • ❌ Shift decision boundary

  • ✅ Apply a different activation function
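
A short sketch of the idea (assuming the question is about changing the network's non-linearity rather than its size): the same architecture can be built with a different activation function slotted in.

```python
from torch import nn

def shallow_net(activation: nn.Module) -> nn.Sequential:
    # Identical layer sizes; only the activation function changes.
    return nn.Sequential(nn.Linear(2, 6), activation, nn.Linear(6, 1))

sigmoid_net = shallow_net(nn.Sigmoid())
tanh_net = shallow_net(nn.Tanh())
relu_net = shallow_net(nn.ReLU())
```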


5. Question 5

  • ❌ Too many neurons

  • ❌ Too few neurons

  • ❌ High learning rate

  • ✅ Insufficient training data


6. Question 6

  • ❌ Captured all patterns

  • ✅ Model cannot capture data complexity

  • ❌ Too many layers

  • ❌ Too complex


7. Question 7

  • ❌ Add hidden layers

  • ✅ Set the number of output neurons equal to the number of classes (see the sketch below)

  • ❌ Use single neuron

  • ❌ Use sigmoid
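
A sketch of this for an assumed 3-class problem (the class count and layer sizes are made up for illustration): the final nn.Linear has one output neuron per class.

```python
import torch
from torch import nn

num_classes = 3   # assumed class count, for illustration only

model = nn.Sequential(
    nn.Linear(4, 16),
    nn.ReLU(),
    nn.Linear(16, num_classes),   # one output neuron per class
)

logits = model(torch.randn(5, 4))
print(logits.shape)               # torch.Size([5, 3]) -- one score per class per sample
```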


8. Question 8

  • ❌ Mean Squared Error

  • ❌ Hinge Loss

  • ❌ Binary Cross Entropy

  • ✅ Cross Entropy
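
A minimal usage sketch with made-up logits and labels: PyTorch's nn.CrossEntropyLoss takes raw (un-softmaxed) scores and integer class indices.

```python
import torch
from torch import nn

criterion = nn.CrossEntropyLoss()

logits = torch.randn(5, 3)              # 5 samples, 3 classes (raw scores, no softmax)
targets = torch.tensor([0, 2, 1, 1, 0])  # integer class labels

loss = criterion(logits, targets)
print(loss.item())
```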


9. Question 9

  • ✅ Compute gradients for updating the weights (see the sketch below)

  • ❌ Forward propagation

  • ❌ Reduce layers

  • ❌ Apply activation
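
A sketch of one training step on toy data (the model and values are arbitrary): loss.backward() is the backpropagation call that computes the gradients, which the optimizer then uses to update the weights.

```python
import torch
from torch import nn

model = nn.Linear(4, 3)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(5, 4)              # toy inputs
y = torch.tensor([0, 2, 1, 1, 0])  # toy class labels

loss = criterion(model(x), y)      # forward pass
optimizer.zero_grad()
loss.backward()                    # backpropagation: compute gradients w.r.t. all weights
optimizer.step()                   # use the gradients to update the weights
```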


10. Question 10

  • ❌ Softmax

  • ❌ Sigmoid

  • ❌ Tanh

  • ✅ ReLU
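
A brief sketch of ReLU's behaviour (illustrative values only; the exact question wording is not reproduced above): negative inputs are clipped to zero and positive inputs pass through with gradient 1, which is one reason it is the usual default for hidden layers.

```python
import torch

z = torch.tensor([-2.0, 0.5, 3.0], requires_grad=True)
a = torch.relu(z)        # negatives -> 0, positives pass through unchanged
a.sum().backward()

print(a)                 # tensor([0.0000, 0.5000, 3.0000], grad_fn=...)
print(z.grad)            # tensor([0., 1., 1.]) -- gradient is 1 for positive inputs
```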


🧾 Summary Table

Q# Correct Answer
1 Sigmoid maps to [0,1]
2 Input + Output = 2-layer net
3 More neurons → more flexibility
4 Use another activation function
5 Overfitting due to little data
6 Underfitting = cannot capture complexity
7 Output neurons = number of classes
8 Cross Entropy
9 Backprop computes gradients
10 ReLU