Information Theory Foundations
9 questions · Difficulty 1-7 · 4 foundation, 4 intermediate, 1 advanced · Adapts to your performance

Question 1 of 9 · foundation (1/10) · compute
In classification, the cross-entropy loss for a single example with one-hot label $y$ and predicted probabilities $\hat{p}$ reduces to:
A. $\sum_i \hat{p}_i$: the sum of all predicted probabilities
B. $-\log \hat{p}_{\text{true class}}$: the negative log probability assigned to the correct class
C. $(1 - \hat{p}_{\text{true class}})^2$: the squared complement of the correct-class probability
D. $-\hat{p}_{\text{true class}}$: the negative probability of the correct class
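A minimal sketch, assuming NumPy (the label and probability values are illustrative, not part of the quiz), of the general cross-entropy definition $H(y, \hat{p}) = -\sum_i y_i \log \hat{p}_i$ applied to a one-hot label:

```python
import numpy as np

# One-hot label: class 1 is the true class.
y = np.array([0.0, 1.0, 0.0])
# Predicted probabilities for the same example (sum to 1).
p_hat = np.array([0.2, 0.7, 0.1])

# General cross-entropy: -sum_i y_i * log(p_hat_i).
loss = -np.sum(y * np.log(p_hat))
print(loss)  # ~0.357; only the term for the true class contributes
```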