Module-level Graded Quiz: Logistic Regression Cross Entropy Loss: Deep Learning with PyTorch (IBM AI Engineering Professional Certificate) Answers 2025
1. Question 1
- ❌ To minimize variance
- ❌ To calculate misclassified samples
- ✅ To maximize the likelihood of the correct class
- ❌ To average squared errors
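
A minimal PyTorch sketch of why this holds (the logits and target are illustrative): cross-entropy is the negative log of the probability the model assigns to the correct class, so driving the loss down pushes that probability up.

```python
import torch
import torch.nn.functional as F

logits = torch.tensor([[2.0, 0.5, -1.0]])  # raw scores for one sample, 3 classes
target = torch.tensor([0])                 # index of the correct class

# Cross-entropy = -log(softmax probability of the correct class),
# so minimizing it is the same as maximizing that likelihood.
p_correct = F.softmax(logits, dim=1)[0, target.item()]
print(-torch.log(p_correct))               # ~0.24
print(F.cross_entropy(logits, target))     # same value
```
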
2. Question 2
- ❌ Minimizes misclassified samples
- ❌ Increases correct classification
- ❌ Speeds up training
- ✅ It results in a flat cost surface that can cause parameters to get stuck
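
A small autograd sketch of that flat region (z = -10 is an illustrative saturated input): the prediction is confidently wrong, yet the MSE gradient is nearly zero, so gradient descent barely moves.

```python
import torch

# Target is 1, but z is very negative, so sigmoid(z) ~ 0: a confidently
# wrong, saturated prediction.
z = torch.tensor([-10.0], requires_grad=True)
y = torch.tensor([1.0])

mse = (torch.sigmoid(z) - y).pow(2).mean()
mse.backward()
print(z.grad)  # ~ -9e-05: an almost-zero gradient, so the parameter gets stuck
```
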
3. Question 3
- ✅ The sigmoid function provides a smooth cost surface
- ❌ Reduces misclassified samples to zero
- ❌ Increases speed
- ❌ Easier to implement
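
For contrast, a sketch of the same saturated point under binary cross-entropy: the gradient stays close to -1, which is the smooth-cost-surface behaviour the correct answer refers to.

```python
import torch
import torch.nn.functional as F

# Same confidently wrong point as in the MSE sketch above.
z = torch.tensor([-10.0], requires_grad=True)
y = torch.tensor([1.0])

bce = F.binary_cross_entropy(torch.sigmoid(z), y)
bce.backward()
print(z.grad)  # ~ -1.0: a strong gradient, so training keeps making progress
```
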
4. Question 4
- ❌ Sum of squared residuals
- ✅ Negative logarithm of the likelihood
- ❌ Difference between predicted and actual
- ❌ Maximum likelihood
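
In symbols, for a binary label y and predicted probability ŷ = σ(z), the per-sample cross-entropy loss is the negative log-likelihood:

```latex
\ell(y, \hat{y}) = -\log P(y \mid \hat{y})
                 = -\bigl[\, y \log \hat{y} + (1 - y)\log(1 - \hat{y}) \,\bigr]
```
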
5. Question 5
- ❌ nn.BCELoss
- ❌ nn.MSELoss
- ❌ nn.SoftmaxLoss
- ✅ nn.CrossEntropyLoss
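
A minimal usage sketch with random placeholder data: nn.CrossEntropyLoss expects raw logits (it applies log-softmax internally) and integer class indices. Note that nn.SoftmaxLoss is a distractor with no PyTorch counterpart, and nn.BCELoss targets the binary case.

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()     # applies log-softmax internally

logits = torch.randn(4, 3)            # batch of 4 samples, 3 classes, raw scores
targets = torch.tensor([0, 2, 1, 2])  # class indices, not one-hot vectors

loss = criterion(logits, targets)
print(loss)
```
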
6. Question 6
- ❌ Faster than all optimizers
- ✅ Uses only a portion of the dataset to minimize loss
- ❌ Guarantees global minimum
- ❌ Increases learning rate automatically
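
A minimal mini-batch SGD sketch (the model and data are placeholders): each optimizer step sees only one batch of 10 samples out of 100, not the whole dataset.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

X = torch.randn(100, 2)                  # toy features
y = torch.randint(0, 2, (100,)).float()  # toy binary labels

loader = DataLoader(TensorDataset(X, y), batch_size=10, shuffle=True)
model = nn.Sequential(nn.Linear(2, 1), nn.Sigmoid())
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.BCELoss()

for x_batch, y_batch in loader:          # each step uses 10 of the 100 samples
    optimizer.zero_grad()
    loss = criterion(model(x_batch).squeeze(1), y_batch)
    loss.backward()
    optimizer.step()
```
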
7. Question 7
- ✅ Converts linear outputs into probabilities
- ❌ Thresholding
- ❌ Computes gradient
- ❌ Adjusts learning rate
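
A quick illustration with arbitrary logits: sigmoid squashes each raw linear output into a probability.

```python
import torch

z = torch.tensor([-3.0, 0.0, 3.0])  # raw linear outputs (logits)
p = torch.sigmoid(z)
print(p)  # tensor([0.0474, 0.5000, 0.9526]): each value is now a probability
```
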
8. Question 8
- ❌ Updates parameters
- ❌ Resets parameters
- ✅ Computes gradients w.r.t. model parameters
- ❌ Calculates next loss
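
A sketch of exactly what loss.backward() does and does not do: it fills each parameter's .grad attribute but leaves the parameter values themselves untouched.

```python
import torch
from torch import nn

model = nn.Linear(2, 1)
loss = model(torch.randn(4, 2)).pow(2).mean()  # any scalar loss

print(model.weight.grad)  # None: no gradients yet
loss.backward()           # backprop: populates .grad for each parameter
print(model.weight.grad)  # now a tensor of dL/dW; the weights are unchanged
```
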
9. Question 9
- ❌ Averaging input data
- ✅ Using optimizer.step()
- ❌ Recalculating loss
- ❌ Increasing learning rate
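
A companion sketch to the previous question: backward() only computes gradients; it is optimizer.step() that actually changes the weights (for plain SGD, w becomes w - lr * grad).

```python
import torch
from torch import nn

model = nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

loss = model(torch.randn(4, 2)).pow(2).mean()
loss.backward()                         # gradients only; weights untouched

before = model.weight.detach().clone()
optimizer.step()                        # w <- w - lr * w.grad
print(model.weight.detach() - before)   # nonzero: this call applied the update
```
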
10. Question 10
- ❌ 0–10
- ✅ 0–1
- ❌ –1 to 1
- ❌ 0–100
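
A quick range check with extreme inputs: even then, sigmoid stays within 0 and 1 (mathematically the open interval (0, 1); the exact endpoints printed below are float32 rounding).

```python
import torch

z = torch.tensor([-100.0, -1.0, 0.0, 1.0, 100.0])
print(torch.sigmoid(z))
# tensor([0.0000, 0.2689, 0.5000, 0.7311, 1.0000]): always within 0 and 1
```
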
🧾 Summary Table
| Q# | Correct Answer |
|---|---|
| 1 | Maximize likelihood of correct class |
| 2 | Flat cost surface with MSE |
| 3 | Smooth cost surface |
| 4 | Negative log-likelihood |
| 5 | nn.CrossEntropyLoss |
| 6 | Uses portion of dataset (SGD) |
| 7 | Converts output to probabilities |
| 8 | Computes gradients |
| 9 | optimizer.step() |
| 10 | Output range = 0 to 1 |