
Week 3 Quiz: Sequences, Time Series and Prediction (DeepLearning.AI TensorFlow Developer Professional Certificate) Answers 2025

1. Question 1

What’s the primary difference between a simple RNN and an LSTM?

  • ❌ LSTMs have multiple outputs

  • ❌ RNNs have a cell state

  • ✅ In addition to the H output, LSTMs have a cell state that runs across all cells

  • ❌ LSTMs have a single output

Explanation:
Simple RNN → only hidden state (H)
LSTM → hidden state (H) + cell state (C) for long-term memory.
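The difference can be sketched in plain Python (a toy, scalar-weight version with made-up fixed weights, not real learned matrices):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Simple RNN cell: the only state carried forward is the hidden output H.
def rnn_step(x, h_prev, w=0.5, u=0.5):
    return math.tanh(w * x + u * h_prev)  # new H

# LSTM cell: carries a hidden state H *and* a cell state C.
def lstm_step(x, h_prev, c_prev, w=0.5):
    f = sigmoid(w * (x + h_prev))         # forget gate
    i = sigmoid(w * (x + h_prev))         # input gate
    o = sigmoid(w * (x + h_prev))         # output gate
    c_tilde = math.tanh(w * (x + h_prev))  # candidate cell value
    c = f * c_prev + i * c_tilde          # cell state runs across all cells
    h = o * math.tanh(c)                  # hidden output
    return h, c

h1 = rnn_step(1.0, 0.0)             # RNN: one state out
h2, c2 = lstm_step(1.0, 0.0, 0.0)   # LSTM: two states out (H and C)
```

The extra cell state C is what lets the LSTM carry information over long spans without it being squashed at every step.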


2. Question 2

How do you clear TensorFlow’s previous session variables?

  • ❌ tf.cache.backend.clear_session()

  • ✅ tf.keras.backend.clear_session()

  • ❌ tf.keras.clear_session

  • ❌ tf.cache.clear_session()

Explanation:
tf.keras.backend.clear_session() resets graph state and frees memory.


3. Question 3

What does a Lambda layer do?

  • ❌ Changes shape only

  • ❌ No Lambda layers exist

  • ❌ Pauses training

  • ✅ Allows you to execute arbitrary code while training

Explanation:
Lambda allows custom operations:

Lambda(lambda x: x / 255.0)

4. Question 4

If X is the input, what are the outputs of an RNN?

  • ❌ Y

  • ❌ H

  • ✅ Ŷ (Y-hat) and H

  • ❌ H-hat and Y

Explanation:
Ŷ = prediction
H = hidden state passed to next cell.


5. Question 5

A new loss function was introduced in this module, named after a famous statistician. What is it called?

  • ❌ Hyatt

  • ✅ Huber loss

  • ❌ Hawking

  • ❌ Hubble

Explanation:
Huber loss = robust loss combining MSE + MAE benefits.
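A minimal pure-Python sketch of the Huber loss for a single example (quadratic near zero like MSE, linear in the tails like MAE; delta is the crossover point):

```python
def huber(y_true, y_pred, delta=1.0):
    """Quadratic for small errors (MSE-like), linear for large ones (MAE-like)."""
    error = abs(y_true - y_pred)
    if error <= delta:
        return 0.5 * error ** 2
    return delta * (error - 0.5 * delta)

huber(3.0, 3.2)   # small error -> quadratic: 0.5 * 0.2**2 = 0.02
huber(3.0, 8.0)   # large error -> linear: 1.0 * (5.0 - 0.5) = 4.5
```

Because large errors grow linearly rather than quadratically, outliers pull on the model far less than with plain MSE.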


6. Question 6

If an RNN has 30 cells numbered 0 to 29, what does the sequence-to-vector output consist of?

  • ❌ Second cell output

  • ✅ Ŷ from the last cell

  • ❌ Average of all outputs

  • ❌ Total of all outputs

Explanation:
Sequence-to-vector = final output only.
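A toy RNN loop makes the idea concrete (scalar weights, hypothetical values; with 30 cells the vector output would simply be the output of cell 29):

```python
import math

def simple_rnn(xs, return_sequences=False, w=0.5, u=0.5):
    """Run a toy RNN over a sequence; emit every H or just the last one."""
    h = 0.0
    outputs = []
    for x in xs:
        h = math.tanh(w * x + u * h)
        outputs.append(h)
    # Sequence-to-vector: discard all intermediate outputs, keep the last.
    return outputs if return_sequences else outputs[-1]

seq = [0.1, 0.2, 0.3]
vec = simple_rnn(seq)                          # a single value: Ŷ from the last cell
full = simple_rnn(seq, return_sequences=True)  # one output per cell
```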


7. Question 7

What does the axis parameter of tf.expand_dims do?

  • ❌ Dimension index to remove

  • ❌ Axis to expand around

  • ❌ Defines X vs Y

  • ✅ Defines the index where the new dimension is inserted

Example:

tf.expand_dims(tensor, axis=1)
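The rule can be shown purely in terms of shapes (a plain-Python sketch of the behavior, not TensorFlow itself): a new size-1 dimension is inserted at the given index.

```python
def expanded_shape(shape, axis):
    """Shape after a tf.expand_dims-style insert of a size-1 dimension at `axis`."""
    shape = list(shape)
    if axis < 0:                 # negative axes count from the end, as in TF
        axis += len(shape) + 1
    return tuple(shape[:axis] + [1] + shape[axis:])

expanded_shape((3, 4), axis=0)   # (1, 3, 4)
expanded_shape((3, 4), axis=1)   # (3, 1, 4)
expanded_shape((3, 4), axis=-1)  # (3, 4, 1)
```

This is the usual trick for turning a batch of series of shape (batch, timesteps) into the (batch, timesteps, 1) shape that RNN layers expect.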

8. Question 8

What happens with this model?

Bidirectional(LSTM(32)),
Bidirectional(LSTM(32)),
Dense(1)

  • ❌ Model fails because return_sequences=True needed after each

  • ❌ Model fails because same number of cells

  • ❌ Model will compile and run

  • ✅ Model will fail because the first LSTM needs return_sequences=True

Explanation:
Stacking RNNs requires the first one to return full sequences:

return_sequences=True
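The shape logic behind this can be sketched in plain Python (a toy stand-in for Keras layers, not the real API): the second layer needs a sequence as input, but without return_sequences=True the first layer hands it a single value.

```python
import math

def rnn_layer(xs, return_sequences=False, w=0.5, u=0.5):
    """Toy RNN layer: returns all per-step outputs or only the last one."""
    h = 0.0
    hs = []
    for x in xs:
        h = math.tanh(w * x + u * h)
        hs.append(h)
    return hs if return_sequences else hs[-1]

seq = [0.1, 0.2, 0.3]

# Works: the first layer returns the full sequence for the second to consume.
stacked = rnn_layer(rnn_layer(seq, return_sequences=True))

# Fails: with return_sequences=False the first layer returns a single number,
# which is not a sequence the next layer can iterate over.
try:
    rnn_layer(rnn_layer(seq))
except TypeError:
    pass  # 'float' object is not iterable
```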

🧾 Summary Table

| Q# | Correct Answer | Key Concept |
|----|----------------|-------------|
| 1 | LSTM has a cell state | RNN vs LSTM |
| 2 | tf.keras.backend.clear_session() | Resetting TF state |
| 3 | Lambda = custom operations | Lambda layer |
| 4 | Ŷ and H | RNN outputs |
| 5 | Huber loss | Robust loss |
| 6 | Ŷ from the final cell | Sequence-to-vector |
| 7 | Inserts a new dimension at the given index | expand_dims axis |
| 8 | First LSTM needs return_sequences=True | Stacking LSTMs |