reluLayer

Rectified Linear Unit (ReLU) layer

Description

A ReLU layer performs a threshold operation on each element of the input, setting any value less than zero to zero.

This operation is equivalent to

f(x) = \begin{cases} x, & x \ge 0 \\ 0, & x < 0. \end{cases}
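
As a minimal sketch of this operation, the following passes a small vector through a ReLU layer inside a dlnetwork. The 5-element featureInputLayer size and the sample values are illustrative choices, not part of the layer definition:

layers = [
    featureInputLayer(5)   % illustrative input size
    reluLayer];
net = dlnetwork(layers);
X = dlarray([-2; -1; 0; 1; 2],"CB");   % channel-batch formatted sample
Y = predict(net,X)                     % negative entries become 0: [0; 0; 0; 1; 2]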

Creation

Description

layer = reluLayer creates a ReLU layer.

layer = reluLayer('Name',Name) creates a ReLU layer and sets the optional Name property using a name-value pair. For example, reluLayer('Name','relu1') creates a ReLU layer with the name 'relu1'.

Properties

Name

Layer name, specified as a character vector or string scalar. For Layer array input, the trainnet and dlnetwork functions automatically assign names to layers with the name "".

The ReLULayer object stores this property as a character vector.

Data Types: char | string

NumInputs

This property is read-only.

Number of inputs to the layer, returned as 1. This layer accepts a single input only.

Data Types: double

InputNames

This property is read-only.

Input names, returned as {'in'}. This layer accepts a single input only.

Data Types: cell

NumOutputs

This property is read-only.

Number of outputs from the layer, returned as 1. This layer has a single output only.

Data Types: double

OutputNames

This property is read-only.

Output names, returned as {'out'}. This layer has a single output only.

Data Types: cell
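
You can inspect these read-only properties on a created layer. A quick sketch (the name "relu1" is an arbitrary choice):

layer = reluLayer(Name="relu1");
layer.NumInputs    % 1
layer.InputNames   % {'in'}
layer.NumOutputs   % 1
layer.OutputNames  % {'out'}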

Examples


Create a ReLU layer with the name relu1.

layer = reluLayer(Name="relu1")
layer = 
  ReLULayer with properties:

    Name: 'relu1'

Include a ReLU layer in a Layer array.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    maxPooling2dLayer(2,Stride=2)
    fullyConnectedLayer(10)
    softmaxLayer]
layers = 
  6x1 Layer array with layers:

     1   ''   Image Input       28x28x1 images with 'zerocenter' normalization
     2   ''   2-D Convolution   20 5x5 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   ReLU              ReLU
     4   ''   2-D Max Pooling   2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     5   ''   Fully Connected   10 fully connected layer
     6   ''   Softmax           softmax
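
As a follow-up sketch, you can assemble this array into an initialized network; training it would additionally require data and a loss function, for example via trainnet:

net = dlnetwork(layers);   % assemble and initialize the network
analyzeNetwork(net)        % optional: interactively inspect layer sizes and connections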



Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Version History

Introduced in R2016a