Week 2 Quiz: Sequences, Time Series and Prediction (DeepLearning.AI TensorFlow Developer Professional Certificate) Answers 2025
1. Question 1 — What does MAE stand for?
- ❌ Mean Average Error
- ❌ Mean Advanced Error
- ✅ Mean Absolute Error
- ❌ Mean Active Error
Explanation:
MAE = average of absolute differences between predictions and actual values.
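As a quick check of the definition, here is a minimal NumPy sketch (the values are made up for illustration):

```python
import numpy as np

# Toy predictions and ground truth (illustrative values only)
actual = np.array([3.0, 5.0, 2.0])
predicted = np.array([2.5, 5.0, 4.0])

# MAE: average of the absolute differences
mae = np.mean(np.abs(predicted - actual))
print(mae)  # (0.5 + 0.0 + 2.0) / 3 ≈ 0.8333
```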
2. Question 2 — Correct code to split window into features + label
- ❌ window[n-1], window[1]
- ✅ `dataset = dataset.map(lambda window: (window[:-1], window[-1:]))`
- ❌ window[-1:], window[:-1]
- ❌ window[n], window[1]
Explanation:
`window[:-1]` = all except the last element → features; `window[-1:]` = the last element → label.
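The winning line assumes a tf.data windowing pipeline; a minimal end-to-end sketch with a toy series 0–9 and window size 5 might look like this:

```python
import tensorflow as tf

# Build overlapping windows of 5 consecutive values from a toy series 0..9
dataset = tf.data.Dataset.range(10)
dataset = dataset.window(5, shift=1, drop_remainder=True)
dataset = dataset.flat_map(lambda window: window.batch(5))

# Split each window: everything but the last value is the features,
# the last value is the label
dataset = dataset.map(lambda window: (window[:-1], window[-1:]))

for x, y in dataset.take(1):
    print(x.numpy(), y.numpy())  # [0 1 2 3] [4]
```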
3. Question 3 — How to inspect learned parameters?
- ❌ Decompile model
- ❌ Run model with unit data
- ❌ Iterate layers blindly
- ✅ Assign the layer to a variable, add it to the model, inspect after training
Explanation:
If you keep a reference like:

```python
dense = tf.keras.layers.Dense(10)
model.add(dense)
dense.get_weights()
```

You can inspect weights easily.
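A self-contained sketch of the same idea, using a single-unit Dense layer and tiny made-up data (the shapes and numbers are purely illustrative):

```python
import numpy as np
import tensorflow as tf

# Keep a reference to the layer instead of burying it in the model definition
dense = tf.keras.layers.Dense(1)
model = tf.keras.Sequential([tf.keras.Input(shape=(3,)), dense])
model.compile(optimizer="sgd", loss="mse")

# Tiny illustrative fit; the data itself doesn't matter here
x = np.random.rand(8, 3)
y = x.sum(axis=1, keepdims=True)
model.fit(x, y, epochs=1, verbose=0)

# get_weights() returns [kernel, bias] as NumPy arrays
kernel, bias = dense.get_weights()
print(kernel.shape, bias.shape)  # (3, 1) (1,)
```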
4. Question 4 — Change learning rate after each epoch
- ❌ LearningRateScheduler in model.compile()
- ❌ Custom callback modifying SGD directly
- ✅ Use LearningRateScheduler in the callbacks of model.fit()
- ❌ You can't set it
Explanation:
Use:

```python
callbacks=[tf.keras.callbacks.LearningRateScheduler(fn)]
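A runnable sketch with an illustrative schedule that halves the learning rate each epoch (the model and data are dummies, only the callback matters):

```python
import numpy as np
import tensorflow as tf

# Illustrative schedule: halve the learning rate at the start of every epoch
def scheduler(epoch, lr):
    return lr * 0.5

model = tf.keras.Sequential([tf.keras.Input(shape=(1,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1), loss="mse")

# Tiny dummy fit so the callback actually runs (the data is meaningless)
x = np.zeros((4, 1))
y = np.zeros((4, 1))
model.fit(x, y, epochs=3, verbose=0,
          callbacks=[tf.keras.callbacks.LearningRateScheduler(scheduler)])

final_lr = float(model.optimizer.learning_rate.numpy())
print(final_lr)  # 0.1 halved three times ≈ 0.0125
```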
5. Question 5 — What does drop_remainder=True do?
- ❌ Adds data
- ❌ Crops data
- ❌ Ensures all data is used
- ✅ Ensures all batches have the same shape (drops incomplete final batch)
Explanation:
If leftover data doesn’t fill a full batch, it gets dropped.
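A tiny demo: batching 10 elements in groups of 3 leaves one element over, and drop_remainder=True discards it.

```python
import tensorflow as tf

# 10 elements batched in groups of 3: the incomplete final batch (just [9])
# is dropped, so every batch that remains has the same shape
dataset = tf.data.Dataset.range(10).batch(3, drop_remainder=True)
batches = [batch.numpy() for batch in dataset]
for batch in batches:
    print(batch)  # [0 1 2], [3 4 5], [6 7 8]
```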
6. Question 6 — What does MSE stand for?
- ✅ Mean Squared Error
- ❌ Mean Slight Error
- ❌ Mean Series Error
- ❌ Mean Second Error
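Using the same toy numbers as in Question 1, a minimal NumPy sketch of MSE:

```python
import numpy as np

actual = np.array([3.0, 5.0, 2.0])
predicted = np.array([2.5, 5.0, 4.0])

# MSE squares each error, so large mistakes are penalized more than in MAE
mse = np.mean((predicted - actual) ** 2)
print(mse)  # (0.25 + 0.0 + 4.0) / 3 ≈ 1.4167
```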
7. Question 7 — Correct train/validation split
- ❌ First option
- ❌ Second option
- ✅
  ```python
  time_train = time[:split_time]
  x_train = series[:split_time]
  time_valid = time[split_time:]
  x_valid = series[split_time:]
  ```
- ❌ Fourth option
Explanation:
Before split_time → training
After split_time → validation
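A runnable version of the split on a toy series (100 steps and split_time=80 are made-up values):

```python
import numpy as np

# Toy series: 100 time steps (illustrative)
time = np.arange(100)
series = np.sin(time * 0.1)
split_time = 80

# Everything before split_time trains the model; the rest validates it
time_train, x_train = time[:split_time], series[:split_time]
time_valid, x_valid = time[split_time:], series[split_time:]

print(len(x_train), len(x_valid))  # 80 20
```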
8. Question 8 — How to set learning rate of SGD optimizer?
- ❌ RateOfLearning
- ❌ Can't set
- ❌ Rate
- ✅ learning_rate
Example:

```python
optimizer = tf.keras.optimizers.SGD(learning_rate=1e-4)
```
9. Question 9 — What is a windowed dataset?
- ❌ Consistent set of subsets
- ❌ Time series aligned to a fixed shape
- ❌ No such thing
- ✅ A fixed-size subset of a time series
Explanation:
A “window” = a slice of consecutive time steps.
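Putting the pieces from Questions 2 and 5 together, a windowed-dataset helper along the lines of the pattern used in the course might look like this (the parameter values below are illustrative):

```python
import tensorflow as tf

def windowed_dataset(series, window_size, batch_size, shuffle_buffer):
    """Slice a series into fixed-size (features, label) windows."""
    dataset = tf.data.Dataset.from_tensor_slices(series)
    # window_size + 1 so each window holds the features plus the label
    dataset = dataset.window(window_size + 1, shift=1, drop_remainder=True)
    dataset = dataset.flat_map(lambda w: w.batch(window_size + 1))
    dataset = dataset.shuffle(shuffle_buffer)
    dataset = dataset.map(lambda w: (w[:-1], w[-1]))
    return dataset.batch(batch_size).prefetch(1)

ds = windowed_dataset(tf.range(20, dtype=tf.float32), window_size=4,
                      batch_size=2, shuffle_buffer=10)
for x, y in ds.take(1):
    print(x.shape, y.shape)  # (2, 4) (2,)
```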
🧾 Summary Table
| Q# | Correct Answer | Key Concept |
|---|---|---|
| 1 | Mean Absolute Error | Loss metric |
| 2 | (window[:-1], window[-1:]) | Window split |
| 3 | Assign layer variable & inspect | Extracting weights |
| 4 | LearningRateScheduler in fit() | Dynamic learning rates |
| 5 | Drops incomplete final batch | drop_remainder |
| 6 | Mean Squared Error | Popular loss metric |
| 7 | Proper slicing for split | Train/validation split |
| 8 | learning_rate property | Optimizer config |
| 9 | Fixed-size subset | Windowed dataset |