Weight Initialization
1 question · intermediate (4/10) · conceptual
Xavier (Glorot) initialization sets the weight variance to Var(W) = 2/(n_in + n_out). What problem does this solve?
A. It breaks symmetry by ensuring all neurons in each layer learn maximally different feature representations from the very first training step.
B. It transforms the non-convex loss landscape into a locally convex region near the initialization point, enabling gradient descent convergence.
C. It guarantees convergence to a global minimum.
D. It preserves the variance of activations across layers during both forward and backward passes, preventing signal explosion or vanishing.
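As a hands-on check before answering, the effect of the 2/(n_in + n_out) variance on activations can be probed with a short NumPy sketch. The layer width (256), depth (10), and batch size here are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier(n_in, n_out, rng):
    # Xavier/Glorot: Var(W) = 2 / (n_in + n_out)
    std = np.sqrt(2.0 / (n_in + n_out))
    return rng.normal(0.0, std, size=(n_in, n_out))

# Propagate a standard-normal batch through several linear layers.
x = rng.normal(size=(10000, 256))
for _ in range(10):
    x = x @ xavier(256, 256, rng)

# With equal widths, each layer multiplies the activation variance by
# n_in * Var(W) = 256 * (2/512) = 1, so the variance stays near 1.0
# rather than exploding or vanishing with depth.
print(float(x.var()))
```

Replacing `std` with a much larger or smaller constant and rerunning makes the exploding/vanishing behavior the question refers to visible after only a few layers.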