
connectLayers

Connect layers in neural network

Description

netUpdated = connectLayers(net,s,d) connects the source layer s to the destination layer d in the dlnetwork object net. The updated network, netUpdated, contains the same layers as net and includes the new connection.
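For example, assuming net already contains two layers named "layer1" and "layer2" (placeholder names for this sketch), one call adds one connection:

% Connect the single output of "layer1" to the single input of "layer2".
% "layer1" and "layer2" are placeholder names for layers already added to net.
net = connectLayers(net,"layer1","layer2");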


Examples


Create an empty dlnetwork object and add an addition layer with two inputs and the name 'add'.

net = dlnetwork;
layer = additionLayer(2,'Name','add');
net = addLayers(net,layer);

Add two ReLU layers to the neural network and connect them to the addition layer. The addition layer outputs the sum of the outputs from the ReLU layers.

layer = reluLayer('Name','relu1');
net = addLayers(net,layer);
net = connectLayers(net,'relu1','add/in1');

layer = reluLayer('Name','relu2');
net = addLayers(net,layer);
net = connectLayers(net,'relu2','add/in2');

Visualize the updated network in a plot.

plot(net)


Define a two-output neural network that predicts both categorical labels and numeric values given 2-D images as input.

Specify the number of classes and responses.

numClasses = 10;
numResponses = 1;

Create an empty neural network.

net = dlnetwork;

Define the layers of the main branch of the network and the softmax output.

layers = [
    imageInputLayer([28 28 1],Normalization="none")

    convolution2dLayer(5,16,Padding="same")
    batchNormalizationLayer
    reluLayer(Name="relu_1")

    convolution2dLayer(3,32,Padding="same",Stride=2)
    batchNormalizationLayer
    reluLayer
    convolution2dLayer(3,32,Padding="same")
    batchNormalizationLayer
    reluLayer

    additionLayer(2,Name="add")

    fullyConnectedLayer(numClasses)
    softmaxLayer(Name="softmax")];

net = addLayers(net,layers);

Add the skip connection.

layers = [
    convolution2dLayer(1,32,Stride=2,Name="conv_skip")
    batchNormalizationLayer
    reluLayer(Name="relu_skip")];

net = addLayers(net,layers);
net = connectLayers(net,"relu_1","conv_skip");
net = connectLayers(net,"relu_skip","add/in2");

Add the fully connected layer for the regression output.

layers = fullyConnectedLayer(numResponses,Name="fc_2");
net = addLayers(net,layers);
net = connectLayers(net,"add","fc_2");

View the neural network in a plot.

figure
plot(net)
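To check that the connections behave as expected, you can initialize the network and run a forward pass on random data. This is a minimal sketch; the two outputs are returned in the order given by net.OutputNames, so inspect that property if the order matters.

% Initialize the learnable parameters using the sizes of the input layer.
net = initialize(net);

% Forward a random batch of one 28-by-28 grayscale image through both outputs.
X = dlarray(rand(28,28,1,1,"single"),"SSCB");
[Y1,Y2] = predict(net,X);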


Input Arguments


net — Neural network, specified as a dlnetwork object.

s — Connection source, specified as a character vector or a string scalar.

  • If the source layer has a single output, then s is the name of the layer.

  • If the source layer has multiple outputs, then s is the layer name followed by the "/" character and the name of the layer output: "layerName/outputName".

Example: "conv"

Example: "mpool/indices"

d — Connection destination, specified as a character vector or a string scalar.

  • If the destination layer has a single input, then d is the name of the layer.

  • If the destination layer has multiple inputs, then d is the layer name followed by the "/" character and the name of the layer input: "layerName/inputName".

Example: "fc"

Example: "add/in1"

Output Arguments


netUpdated — Updated network, returned as an uninitialized dlnetwork object.

To initialize the learnable parameters of a dlnetwork object, use the initialize function.
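A minimal sketch of this last step, assuming all layers and connections are in place:

% Initialize learnable parameters once the network is fully connected.
% If net has no input layers, pass example formatted dlarray inputs instead,
% for example: net = initialize(net,dlarray(zeros(28,28,1,1,"single"),"SSCB"));
net = initialize(net);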

The connectLayers function does not preserve quantization information. If the input network is a quantized network, then the output network does not contain quantization information.

Version History

Introduced in R2017b
