Bagging
Intermediate · 3 questions · Difficulty 5-6
intermediate (5/10)
spot the error
Random forests extend bagging by introducing a second source of randomness at each tree node. What is this extra randomization, and why does it help?
A. Random forests use a random activation function per tree, similar to how neural networks use different nonlinearities
B. Random forests subsample features at each split, decorrelating trees and reducing the variance of the ensemble average
C. Random forests randomize the depth of each tree, with different trees having different fixed maximum depths
D. Random forests subsample training points at each split, which is different from subsampling at the root for bagging
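The claim in option B, that subsampling features at each split decorrelates the trees, can be checked empirically. The sketch below (assuming scikit-learn; the dataset and hyperparameters are illustrative choices, not from the quiz) compares the average pairwise correlation between individual trees' predictions when every split sees all features (bagging-like) versus a random subset per split (random forest).

```python
# Sketch: measure how feature subsampling at each split affects
# the correlation between trees in an ensemble (assumes scikit-learn).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(
    n_samples=500, n_features=20, n_informative=5, random_state=0
)

def mean_tree_correlation(max_features):
    # max_features=None   -> every split considers all features (bagging-like)
    # max_features="sqrt" -> each split considers a random feature subset
    rf = RandomForestClassifier(
        n_estimators=25, max_features=max_features, random_state=0
    ).fit(X, y)
    preds = np.array([tree.predict(X) for tree in rf.estimators_])
    corr = np.corrcoef(preds)
    n = len(corr)
    # average off-diagonal entry: mean correlation between distinct trees
    return (corr.sum() - n) / (n * (n - 1))

bagging_like = mean_tree_correlation(None)
forest = mean_tree_correlation("sqrt")
```

Lower inter-tree correlation is what makes averaging effective: the variance of a mean of correlated predictors shrinks toward the irreducible correlated component, so decorrelating the trees lets the ensemble average cut variance further.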