Naive Bayes
1 question · intermediate (6/10) · conceptual
Naive Bayes classifies by computing $p(y \mid x) \propto p(y) \prod_j p(x_j \mid y)$. The "naive" assumption is conditional independence of the features given the class. Despite this assumption being almost always wrong, Naive Bayes often performs well. Why?
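The decision rule in the question can be sketched directly. This is a minimal illustration with made-up priors and class-conditional probabilities (the class names, feature values, and all numbers below are invented for the example, not part of the question):

```python
import math

# Assumed class priors p(y) and per-feature conditionals p(x_j = 1 | y)
# for three binary features; all values are illustrative.
priors = {"spam": 0.4, "ham": 0.6}
cond = {
    "spam": [0.8, 0.1, 0.7],  # p(x_j = 1 | spam) for j = 0, 1, 2
    "ham":  [0.2, 0.5, 0.3],  # p(x_j = 1 | ham)
}

def unnormalized_posterior(y, x):
    """Compute p(y) * prod_j p(x_j | y), accumulated in log space."""
    log_p = math.log(priors[y])
    for p1, xj in zip(cond[y], x):
        log_p += math.log(p1 if xj == 1 else 1.0 - p1)
    return math.exp(log_p)

x = [1, 0, 1]  # observed binary feature vector
scores = {y: unnormalized_posterior(y, x) for y in priors}
prediction = max(scores, key=scores.get)  # argmax over classes
```

Note that the normalizing constant $p(x)$ is never computed: the argmax over classes only needs the unnormalized scores, which is why the proportionality in the rule suffices.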
A. Classification only requires correct ranking of $p(y \mid x)$ across classes; the independence assumption can distort the probabilities while preserving the class ordering
B.
Naive Bayes uses MAP estimation with Laplace smoothing, which is provably more accurate than standard maximum likelihood estimation for all problems
C.
The factored product of marginal class-conditional distributions always provides a close approximation to the true joint feature distribution
D.
Most real-world datasets happen to have features that are approximately conditionally independent given the class label, validating the core assumption