eluLayer
Exponential linear unit (ELU) layer
Description
An ELU activation layer performs the identity operation on positive inputs and an exponential nonlinearity on negative inputs.
The layer performs the following operation:

f(x) = x                 if x ≥ 0
f(x) = α(exp(x) - 1)     if x < 0

The default value of α is 1. Specify a value of α for the layer by setting the Alpha property.
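For illustration, the same elementwise operation can be sketched in MATLAB with an anonymous function. This is an informal reproduction of the formula above, not the layer's internal implementation:

% Elementwise ELU with the default alpha = 1 (illustrative sketch only)
alpha = 1;
elu = @(x) (x >= 0).*x + (x < 0).*(alpha*(exp(x) - 1));

x = [-2 -1 0 1 2];
y = elu(x)   % negative inputs map to alpha*(exp(x) - 1); positive inputs pass through unchanged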
Creation
Description

layer = eluLayer creates an ELU layer.

layer = eluLayer(alpha) creates an ELU layer and specifies the Alpha property.
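For example, a minimal sketch of both syntaxes (assuming Deep Learning Toolbox is installed):

layer = eluLayer        % ELU layer with the default Alpha of 1
layer = eluLayer(2);    % ELU layer with Alpha set to 2
layer.Alpha             % returns 2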
Properties
Examples
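The following is an illustrative sketch of including an ELU layer in a layer array; the surrounding layer constructors (imageInputLayer, convolution2dLayer, batchNormalizationLayer, fullyConnectedLayer, softmaxLayer) are standard Deep Learning Toolbox functions, and the network shape is chosen only for demonstration.

% Simple convolutional layer array with an ELU activation (illustrative)
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    eluLayer
    fullyConnectedLayer(10)
    softmaxLayer];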
Algorithms
References
[1] Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and accurate deep network learning by exponential linear units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).
Extended Capabilities
Version History
Introduced in R2019a
See Also
trainnet | trainingOptions | dlnetwork | batchNormalizationLayer | leakyReluLayer | clippedReluLayer | reluLayer | swishLayer