Cross-Entropy Loss Deep Dive
4 questions · Difficulty 3-8 (1 foundation, 2 intermediate, 1 advanced) · Adapts to your performance

Question 1 of 4 · foundation (3/10) · compute
In binary classification, what happens to the cross-entropy loss when the model predicts probability 0.01 for the true class?
A. The loss is exactly 1.0 because the model made a wrong prediction, and cross-entropy assigns unit loss to each misclassification
B. The loss is 0.01 because cross-entropy directly equals the predicted probability for the incorrect class in binary classification
C. The loss is -log(0.01), which equals approximately 4.6, a very high penalty since the model is confidently wrong about this example
D. The loss is 0.99 because binary cross-entropy measures the complement of the predicted probability for the correct class
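The value in option C is easy to check directly. A minimal sketch of the binary cross-entropy for a single example with true label 1, using only the standard library:

```python
import math

def binary_cross_entropy(p_true_class: float) -> float:
    """Cross-entropy loss for one example: -log of the probability
    the model assigned to the true class."""
    return -math.log(p_true_class)

# Model predicts 0.01 for the true class: confidently wrong.
loss = binary_cross_entropy(0.01)
print(round(loss, 2))  # ≈ 4.61
```

Note how the loss grows without bound as the predicted probability for the true class approaches 0, which is exactly why confident wrong predictions are penalized so heavily.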