reluLayer
Rectified Linear Unit (ReLU) layer
Description
A ReLU layer performs a threshold operation on each element of the input, where any value less than zero is set to zero.
This operation is equivalent to

    f(x) = x,   x ≥ 0
    f(x) = 0,   x < 0
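The following MATLAB sketch illustrates the element-wise thresholding and a typical placement of reluLayer in a layer array. The layer name 'relu1' and the surrounding layers are illustrative assumptions, not prescribed by this page.

% Element-wise effect of the ReLU operation: negative values become zero.
X = [-2 0.5; 3 -1];
Y = max(X,0)    % Y = [0 0.5; 3 0]

% Create a named ReLU layer.
layer = reluLayer('Name','relu1');

% Typical placement after a convolution layer in a layer array
% (the surrounding layers are illustrative, not required).
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,'Padding','same')
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];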
Creation
Properties
Examples
More About
Extended Capabilities
Version History
Introduced in R2016a