leakyReluLayer
Leaky Rectified Linear Unit (ReLU) layer
Description
A leaky ReLU layer performs a threshold operation, where any input value less than zero is multiplied by a fixed scalar.
This operation is equivalent to:

f(x) = \begin{cases} x, & x \ge 0 \\ \mathrm{scale} \times x, & x < 0 \end{cases}
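For example, the following sketch (assuming Deep Learning Toolbox is available) creates a leaky ReLU layer with a scale of 0.1 for negative inputs and places one in a small layer array; the layer name "leaky1" and the surrounding layers are illustrative choices, not part of this page.

% Create a leaky ReLU layer that multiplies negative inputs by 0.1
layer = leakyReluLayer(0.1,Name="leaky1");

% Use a leaky ReLU layer in a simple layer array (assumed architecture)
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    leakyReluLayer(0.01)      % leaky ReLU with scale 0.01
    fullyConnectedLayer(10)
    softmaxLayer];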
Creation
Properties
Examples
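A minimal usage sketch, assuming a small feature-classification network built with dlnetwork; the layer sizes and the random input data are placeholders for illustration only.

% Assemble an assumed network containing a leaky ReLU layer
net = dlnetwork([
    featureInputLayer(4)
    fullyConnectedLayer(8)
    leakyReluLayer(0.2)       % scale negative values by 0.2
    fullyConnectedLayer(3)
    softmaxLayer]);

% Run a forward pass on placeholder data (4 features, 5 observations)
X = dlarray(rand(4,5),"CB");
Y = predict(net,X);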
Algorithms
References
[1] Maas, Andrew L., Awni Y. Hannun, and Andrew Y. Ng. "Rectifier nonlinearities improve neural network acoustic models." In Proc. ICML, vol. 30, no. 1. 2013.
Extended Capabilities
Version History
Introduced in R2017b
See Also
trainnet | trainingOptions | preluLayer | dlnetwork | reluLayer | clippedReluLayer | swishLayer | exportNetworkToSimulink | Leaky ReLU Layer