Beta. Content is under active construction and has not been peer-reviewed. Report errors on GitHub.
Dimensionality Reduction Theory
3 questions · Difficulty 2-4 (2 foundation, 1 intermediate) · Adapts to your performance
foundation (2/10) · conceptual
PCA finds orthogonal directions that maximize variance in the data. Why is this useful for dimensionality reduction?
A. PCA is the only dimensionality reduction method that preserves distances between points exactly
B. PCA always finds the directions that best separate classes, making it ideal for classification preprocessing
C. PCA works only for data with at most 10 dimensions due to numerical constraints of eigendecomposition
D. Projecting onto the top-k principal components preserves the most variance possible in k dimensions, losing the least information
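As a supplement to the property the question stem describes, here is a minimal NumPy sketch (not part of the original quiz) that computes principal components by eigendecomposing the covariance matrix and measures how much variance a top-k projection retains. The data shape, scales, and variable names are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: 200 points in 3-D whose variance is concentrated
# along one axis, so a 1-D projection can keep most of the information.
X = rng.normal(size=(200, 3)) @ np.diag([5.0, 1.0, 0.2])

# Center the data, then eigendecompose its covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending order

# Sort descending: the leading eigenvectors are the principal components.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Fraction of total variance retained by the top-k components.
k = 1
explained = eigvals[:k].sum() / eigvals.sum()
print(f"Variance retained by top-{k} component: {explained:.1%}")

# The dimensionality reduction step: project onto the top-k components.
Z = Xc @ eigvecs[:, :k]  # shape (200, k)
```

Because the components are chosen in order of decreasing eigenvalue, no other k-dimensional orthogonal projection of the centered data retains more variance, which is exactly the sense in which the projection "loses the least information".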