Linear Layer: Shapes, Bias, and Memory
Question 1 of 5 — foundation (2/10), compute
For X ∈ ℝ^{B×D_in}, W ∈ ℝ^{D_in×D_out}, b ∈ ℝ^{D_out}, what is the shape of Y = XW + b?
A. ℝ^{B×D_out} — the matmul collapses D_in, and the bias broadcasts across the batch dimension B.
B. ℝ^{D_in×D_out} — the same shape as W, since matmul preserves the inner shape rather than the outer one.
C. ℝ^{B×D_in} — the same shape as X, since the linear layer rotates the features in place without reshaping.
D. ℝ^{B×D_in×D_out} — broadcasting the bias adds a new axis to the output tensor.