KL Divergence
6 questions · Difficulty 3-8
Question mix: 1 foundation, 3 intermediate, 2 advanced (adapts to your performance)
Question 1 of 6 · foundation (3/10) · state theorem

Which of the following correctly states the data processing inequality in information theory?
A. If X → Y → Z is a Markov chain, then I(X;Z) ≤ I(Y;Z): the further variable always has less mutual information with Z

B. For any three random variables, I(X;Z) ≤ I(X;Y) + I(Y;Z): mutual information satisfies a subadditivity bound

C. If X → Y → Z is a Markov chain, then H(Z) ≤ H(Y) ≤ H(X): entropy strictly decreases along any Markov chain of processing

D. If X → Y → Z is a Markov chain, then I(X;Z) ≤ I(X;Y): processing Y to get Z cannot increase information about X
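The data processing inequality can be checked numerically on a small discrete chain. The sketch below (not part of the quiz; the uniform input and the flip probabilities 0.1 and 0.2 are illustrative choices) builds a Markov chain X → Y → Z from two binary symmetric channels and compares the exact mutual informations:

```python
import math

def mutual_info(joint):
    """Mutual information I(A;B) in bits from a joint pmf dict {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def bsc(x, y, eps):
    """P(y | x) for a binary symmetric channel with flip probability eps."""
    return 1 - eps if x == y else eps

# Markov chain X -> Y -> Z: uniform X passed through two noisy channels.
px = {0: 0.5, 1: 0.5}
eps1, eps2 = 0.1, 0.2  # illustrative flip probabilities

joint_xy = {(x, y): px[x] * bsc(x, y, eps1)
            for x in (0, 1) for y in (0, 1)}

# Marginalize out Y to get the joint pmf of (X, Z).
joint_xz = {}
for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            p = px[x] * bsc(x, y, eps1) * bsc(y, z, eps2)
            joint_xz[(x, z)] = joint_xz.get((x, z), 0.0) + p

i_xy = mutual_info(joint_xy)
i_xz = mutual_info(joint_xz)
print(f"I(X;Y) = {i_xy:.4f} bits, I(X;Z) = {i_xz:.4f} bits")
assert i_xz <= i_xy + 1e-12  # data processing inequality: I(X;Z) <= I(X;Y)
```

Because the two channels compose into a single binary symmetric channel with flip probability eps1(1-eps2) + eps2(1-eps1), the gap between I(X;Y) and I(X;Z) here can also be verified in closed form as 1 - H_b(0.1) versus 1 - H_b(0.26).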