Activation Functions
7 questions (4 foundation, 3 intermediate), difficulty 1-5; the quiz adapts to your performance.

Question 1 of 7 · foundation (1/10) · compute
The ReLU activation function is defined as:
A. ReLU(x) = tanh(x): the hyperbolic tangent
B. ReLU(x) = 1/(1 + e^(-x)): the sigmoid function
C. ReLU(x) = max(0, x): output x if positive, else 0
D. ReLU(x) = x^2: the squared activation
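For reference after answering, here is a minimal sketch (not from the quiz source; Python with NumPy is assumed) that evaluates all four candidate activations on a few sample inputs, so their behavior on negative, zero, and positive values can be compared directly. The function names are hypothetical labels for the options above.

# Minimal sketch comparing the four candidate activations from options A-D.
# Assumes NumPy is available; names are illustrative only.
import numpy as np

def option_a_tanh(x):
    # Option A: tanh(x), saturates toward -1 and +1
    return np.tanh(x)

def option_b_sigmoid(x):
    # Option B: 1 / (1 + e^(-x)), saturates toward 0 and 1
    return 1.0 / (1.0 + np.exp(-x))

def option_c_max_zero(x):
    # Option C: max(0, x), passes positive inputs through and zeroes out negatives
    return np.maximum(0.0, x)

def option_d_squared(x):
    # Option D: x^2, always non-negative and symmetric around 0
    return x ** 2

if __name__ == "__main__":
    xs = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    for label, fn in [("A tanh      ", option_a_tanh),
                      ("B sigmoid   ", option_b_sigmoid),
                      ("C max(0, x) ", option_c_max_zero),
                      ("D x^2       ", option_d_squared)]:
        print(label, fn(xs))

Running the sketch on the sample inputs makes the differences visible: only one of the four maps every negative input to exactly 0 while leaving positive inputs unchanged.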