Data Preprocessing and Feature Engineering
3 questions · Difficulty 2-3
foundation (2/10) · conceptual
Feature scaling (standardization or min-max normalization) is a common preprocessing step for many ML algorithms. Why does it matter?
A. Scaling is required to make the loss function convex for all supervised learning problems
B. Scaling reduces the variance of predictions by a factor proportional to the input range, making the model more generalizable
C. All supervised ML algorithms mathematically require standardized inputs; without scaling, they produce incorrect predictions
D. Algorithms using distances, gradients, or regularization treat features with larger numeric ranges as more important unless scales are equalized
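The effect of unequal feature scales on distance-based computations can be sketched numerically. The snippet below is a minimal illustration (the feature names and values are invented for the example): it applies both standardization and min-max normalization with plain NumPy, then compares Euclidean distances before and after scaling.

```python
import numpy as np

# Two features on very different scales: income (tens of thousands)
# and age (tens). The feature names/values are hypothetical.
X = np.array([[50_000.0, 25.0],
              [52_000.0, 60.0],
              [50_500.0, 26.0]])

# Standardization (z-score): zero mean, unit variance per feature.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# Min-max normalization: rescale each feature to [0, 1].
X_minmax = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

def dists(M):
    """Euclidean distance from row 0 to every other row."""
    return np.linalg.norm(M[1:] - M[0], axis=1)

# Before scaling, income differences (thousands of units) swamp age
# differences (tens of units); after scaling, both contribute comparably.
print(dists(X))
print(dists(X_std))
```

On the raw data, the distance from row 0 to row 1 is driven almost entirely by the 2,000-unit income gap, even though the 35-year age gap is arguably the more meaningful difference; after standardization both features contribute on the same footing.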