Beta. Content is under active construction and has not been peer-reviewed. Report errors on GitHub.
Decision Trees and Ensembles
intermediate (4/10) · conceptual
A classification tree selects splits by maximizing information gain (or equivalently, reducing Gini impurity). Why do trees tend to have high variance and benefit from ensembling?
A. Small data changes can alter the root split, cascading through the tree structure and producing very different trees from similar datasets
B. Trees can only make axis-aligned splits, which limits their expressiveness
C. Decision trees have an excessive number of free parameters relative to the training sample size, causing them to memorize noise in the data
D. Trees use a fundamentally different loss function (Gini impurity or entropy) than other models, which introduces instability in the learned decision boundaries