
Module 4 Challenge: The Nuts and Bolts of Machine Learning (Google Advanced Data Analytics Professional Certificate) Answers 2025

1. Nodes examined in tree-based learning

❌ Root
✔ Decision
❌ Leaf
❌ Branch


2. Benefits of decision trees

✔ Decision trees enable data professionals to make predictions about future events based on currently available information.
✔ Very little preprocessing is required.
✔ No assumptions regarding distribution of data.
❌ Overfitting is unlikely (overfitting is actually a common issue with decision trees; see the sketch below for the preprocessing point)
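
As a quick illustration of the "very little preprocessing" and "no distribution assumptions" points, here is a minimal sketch, assuming scikit-learn and a tiny hand-made dataset (both are assumptions for illustration, not part of the quiz): a decision tree fits unscaled features directly.

```python
# Minimal sketch: a decision tree fit on unscaled, hand-made features.
# No scaling, normalization, or distribution assumptions are required.
from sklearn.tree import DecisionTreeClassifier

X = [[1.0, 200.0], [2.5, 50.0], [3.0, 300.0], [0.5, 10.0]]  # raw, unscaled features
y = [0, 1, 0, 1]

tree = DecisionTreeClassifier(random_state=0)
tree.fit(X, y)
print(tree.predict([[2.0, 40.0]]))
```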


3. Decision nodes can point to

❌ Split
✔ Leaf node
✔ Decision node
❌ Root node


4. Hyperparameter controlling number of trees

❌ max_features
❌ max_depth
❌ n_trees
✔ n_estimators
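
For question 4, a minimal sketch assuming scikit-learn's RandomForestClassifier and a synthetic dataset from make_classification (both assumptions for illustration): n_estimators sets how many trees the forest builds.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=200, random_state=0)

# n_estimators controls the number of trees in the ensemble.
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(X, y)
print(len(rf.estimators_))  # 100 fitted trees
```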


5. Tool for testing hyperparameters

✔ GridSearchCV
❌ Hyperparameter verification
❌ Model validation
❌ Cross validation
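
For question 5, a minimal GridSearchCV sketch, again assuming scikit-learn and a synthetic dataset; the parameter grid values are illustrative only. GridSearchCV cross-validates every combination in the grid and keeps the best one.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=200, random_state=0)

# Illustrative grid: every combination is evaluated with 3-fold cross-validation.
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [3, None],
}

search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```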


6. Ensemble learning

✔ A base learner slightly better than random guess = weak learner
✔ Ensemble predictions can be accurate even if individual models are weak
❌ Different types of models must be trained on completely different data (not required; base learners can share the same training data or resampled versions of it)
✔ Ensemble learning aggregates outputs of multiple models to make a final prediction
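
For question 6, a minimal sketch of aggregating several base learners with scikit-learn's VotingClassifier (the specific base learners and synthetic dataset are illustrative assumptions). Note that all of the base learners are trained on the same data; only their outputs are combined into a final prediction.

```python
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=200, random_state=0)

# Each base learner sees the same training data; the ensemble aggregates
# their votes into a single prediction.
ensemble = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("dt", DecisionTreeClassifier(random_state=0)),
    ("nb", GaussianNB()),
])
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))
```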


7. Random forest is an ensemble of

❌ observations
❌ variables
✔ base learners
❌ statements


8. Benefits of boosting

❌ Most interpretable model methodology (a boosted ensemble is harder to interpret than a single decision tree)
✔ Can handle both numeric and categorical features
✔ Powerful predictive methodology
✔ Does not require data to be scaled


9. Gradient boosting

✔ Works well with missing data
✔ Does not require data to be scaled
❌ Tells you coefficients for each feature (this is linear regression)
❌ Builds models in parallel (gradient boosting builds sequentially)
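
For question 9, a minimal sketch assuming scikit-learn's HistGradientBoostingClassifier, a histogram-based gradient boosting implementation that accepts missing values (NaN) natively; the tiny dataset and the min_samples_leaf setting are made up for illustration. No imputation or scaling step is needed before fitting.

```python
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier

# Features with missing values (NaN); histogram-based gradient boosting
# handles them natively, and the data is not scaled.
X = np.array([[1.0, np.nan], [2.0, 3.0], [np.nan, 0.5],
              [4.0, 2.0], [5.0, np.nan], [0.5, 1.5]])
y = np.array([0, 1, 0, 1, 1, 0])

gb = HistGradientBoostingClassifier(min_samples_leaf=1, random_state=0)
gb.fit(X, y)
print(gb.predict(X))
```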


10. Gini, entropy, info gain, log loss

✔ To determine optimal split points of decision nodes
✔ To quantify purity/impurity of child nodes
❌ To ensure that each class is equally represented in each child node
❌ To prune tree and prevent overfitting
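
For question 10, a minimal sketch of the two impurity measures themselves, written in plain NumPy (not part of the quiz): Gini impurity is 1 - sum(p_k^2) and entropy is -sum(p_k * log2(p_k)) over the class proportions p_k; a perfectly pure child node scores 0 on both.

```python
import numpy as np

def gini(labels):
    # Gini impurity: 1 - sum(p_k^2) over class proportions p_k.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def entropy(labels):
    # Entropy: -sum(p_k * log2(p_k)) over class proportions p_k.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

pure = [1, 1, 1, 1]
mixed = [0, 1, 0, 1]
print(gini(pure), gini(mixed))        # 0.0 vs 0.5
print(entropy(pure), entropy(mixed))  # 0.0 vs 1.0
```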


✅ Summary Table

Q No.  Correct Answer(s)
1      Decision
2      Predictions from currently available information; very little preprocessing; no distribution assumptions
3      Leaf node, Decision node
4      n_estimators
5      GridSearchCV
6      Weak learner = slightly better than a random guess; accurate ensemble predictions from weak models; aggregates outputs of multiple models
7      Base learners
8      Handles numeric and categorical features; powerful predictive methodology; does not require scaled data
9      Works well with missing data; does not require scaled data
10     Determine optimal split points; quantify purity/impurity of child nodes