clippedReluLayer
Clipped Rectified Linear Unit (ReLU) layer
Description
A clipped ReLU layer performs a threshold operation, where any input value less than zero is set to zero and any value above the clipping ceiling is set to that clipping ceiling.
This operation is equivalent to

f(x) = 0,        x < 0
f(x) = x,        0 ≤ x < ceiling
f(x) = ceiling,  x ≥ ceiling

where ceiling is the clipping ceiling.
This clipping prevents the output from becoming too large.
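For example, this minimal sketch creates a clipped ReLU layer and applies the same elementwise operation to sample data. The clipping ceiling of 10, the layer name, and the sample values are arbitrary illustrative choices.

% Clipped ReLU layer with a clipping ceiling of 10 (illustrative value)
layer = clippedReluLayer(10,'Name','clip1');

% Equivalent elementwise operation on sample data
x = [-5 0 5 10 15];
ceiling = 10;
y = min(max(x,0),ceiling)   % returns [0 0 5 10 10]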
Creation
Properties
Examples
Algorithms
Extended Capabilities
Version History
Introduced in R2017b
See Also
trainnet | trainingOptions | dlnetwork | reluLayer | leakyReluLayer | swishLayer | exportNetworkToSimulink | Clipped ReLU Layer