geluLayer
Description
A Gaussian error linear unit (GELU) layer weights the input by its probability under a Gaussian distribution.
This operation is given by

$\mathrm{GELU}(x) = \dfrac{x}{2}\left(1 + \operatorname{erf}\left(\dfrac{x}{\sqrt{2}}\right)\right),$

where erf denotes the error function.
Creation
Description
layer = geluLayer returns a GELU layer.
layer = geluLayer(Name=Value) sets the optional Approximation and Name properties using name-value arguments. For example, geluLayer(Name="gelu") creates a GELU layer with the name "gelu".
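For example, a minimal sketch of both creation syntaxes. The value "tanh" for Approximation is an assumption here (it matches the faster approximation given in [1]); see the Properties section for the supported values.

% Create a GELU layer with default properties.
layer = geluLayer

% Create a GELU layer named "gelu" that uses the tanh-based
% approximation from [1]. The "tanh" value is assumed, not
% confirmed by this page.
layer = geluLayer(Name="gelu", Approximation="tanh")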
Properties
Examples
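As a sketch only (not from this page): a GELU layer placed between fully connected layers in a small image classification network, built from functions in the See Also list. The input size, layer widths, and the softmaxLayer output are illustrative assumptions.

% Assemble a small layer array with a GELU activation between
% fully connected layers. All sizes here are illustrative.
layers = [
    imageInputLayer([28 28 1])
    fullyConnectedLayer(128)
    geluLayer(Name="gelu")
    fullyConnectedLayer(10)
    softmaxLayer];

% Wrap the layer array in a dlnetwork for use with trainnet.
net = dlnetwork(layers);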
Algorithms
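The reference below gives both the exact erf-based definition and a faster tanh-based approximation. A minimal standalone sketch comparing the two in plain MATLAB, independent of geluLayer:

x = linspace(-4, 4, 101);

% Exact GELU: x weighted by the Gaussian CDF, written via erf.
geluExact = x/2 .* (1 + erf(x/sqrt(2)));

% Tanh approximation from Hendrycks and Gimpel [1].
geluApprox = x/2 .* (1 + tanh(sqrt(2/pi)*(x + 0.044715*x.^3)));

% Display the largest difference between the two over this range;
% it is small, which is why the approximation is used in practice.
max(abs(geluExact - geluApprox))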
References
[1] Hendrycks, Dan, and Kevin Gimpel. "Gaussian Error Linear Units (GELUs)." Preprint, submitted June 27, 2016. https://arxiv.org/abs/1606.08415
Extended Capabilities
Version History
Introduced in R2022b
See Also
trainnet | trainingOptions | dlnetwork | fullyConnectedLayer | imageInputLayer | sequenceInputLayer