
Build Deep Neural Networks

Build networks using command-line functions or interactively using the Deep Network Designer app

Build networks from scratch using MATLAB® code or interactively using the Deep Network Designer app. Use built-in layers to construct networks for tasks such as classification and regression. To see a list of built-in layers, see List of Deep Learning Layers. You can then analyze your network to understand the network architecture and check for problems before training.
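For example, a small image classification network can be assembled from built-in layers and then inspected before training. The input size and number of classes below are illustrative assumptions, not values from this page:

```matlab
% Minimal image classification network built from built-in layers.
% Input size [28 28 1] and 10 classes are illustrative assumptions.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Inspect the architecture and check for problems before training.
analyzeNetwork(layers)
```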

If the built-in layers do not provide the layer that you need for your task, then you can define your own custom deep learning layer. You can specify a custom loss function using a custom output layer, and define custom layers with or without learnable parameters. After defining a custom layer, you can check that the layer is valid, is GPU compatible, and outputs correctly defined gradients.
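A minimal sketch of a custom layer with no learnable parameters is shown below; the class name `scalingLayer` and its fixed scale factor are hypothetical examples, not part of the toolbox:

```matlab
% Sketch of a custom layer with no learnable parameters: it scales
% its input by a fixed factor. The name and behavior are illustrative.
classdef scalingLayer < nnet.layer.Layer
    properties
        Scale  % fixed (non-learnable) scaling factor
    end
    methods
        function layer = scalingLayer(scale, name)
            layer.Scale = scale;
            layer.Name  = name;
        end
        function Z = predict(layer, X)
            % Forward pass: elementwise scaling of the input.
            Z = layer.Scale .* X;
        end
    end
end
```

After saving the class file, you can validate the layer with `checkLayer`, passing a valid input size for your data, for example `checkLayer(scalingLayer(2, 'scale'), [24 24 3], 'ObservationDimension', 4)`.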

For networks that cannot be created using layer graphs, you can define custom networks as a function. For an example showing how to train a deep learning model defined as a function, see Train Network Using Model Function.
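A model function of this kind takes the learnable parameters and input data as arguments and applies deep learning operations directly; the sketch below assumes `parameters` is a struct of `dlarray` learnables with illustrative field names:

```matlab
% Sketch of a deep learning model defined as a function, for use in a
% custom training loop. parameters is a struct of dlarray learnables;
% field names (fc1, fc2) are illustrative assumptions. X is a labeled
% dlarray (e.g. with 'CB' dimensions).
function Y = model(parameters, X)
    Y = fullyconnect(X, parameters.fc1.Weights, parameters.fc1.Bias);
    Y = relu(Y);
    Y = fullyconnect(Y, parameters.fc2.Weights, parameters.fc2.Bias);
    Y = softmax(Y);
end
```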

App

Deep Network Designer - Design, visualize, and train deep learning networks

Functions


Input Layers

imageInputLayer - Image input layer
image3dInputLayer - 3-D image input layer
sequenceInputLayer - Sequence input layer
featureInputLayer - Feature input layer

Convolution and Fully Connected Layers

convolution2dLayer - 2-D convolutional layer
convolution3dLayer - 3-D convolutional layer
groupedConvolution2dLayer - 2-D grouped convolutional layer
transposedConv2dLayer - Transposed 2-D convolution layer
transposedConv3dLayer - Transposed 3-D convolution layer
fullyConnectedLayer - Fully connected layer
selfAttentionLayer - Self-attention layer

Recurrent Layers

lstmLayer - Long short-term memory (LSTM) layer for recurrent neural network (RNN)
bilstmLayer - Bidirectional long short-term memory (BiLSTM) layer for recurrent neural network (RNN)
gruLayer - Gated recurrent unit (GRU) layer for recurrent neural network (RNN)
lstmProjectedLayer - Long short-term memory (LSTM) projected layer for recurrent neural network (RNN)

Activation Layers

reluLayer - Rectified linear unit (ReLU) layer
leakyReluLayer - Leaky rectified linear unit (ReLU) layer
clippedReluLayer - Clipped rectified linear unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer
tanhLayer - Hyperbolic tangent (tanh) layer
swishLayer - Swish layer
geluLayer - Gaussian error linear unit (GELU) layer
softmaxLayer - Softmax layer
sigmoidLayer - Sigmoid layer
functionLayer - Function layer

Normalization Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer
instanceNormalizationLayer - Instance normalization layer
layerNormalizationLayer - Layer normalization layer
crossChannelNormalizationLayer - Channel-wise local response normalization layer

Utility Layers

dropoutLayer - Dropout layer
crop2dLayer - 2-D crop layer
crop3dLayer - 3-D crop layer

Data Manipulation

sequenceFoldingLayer - Sequence folding layer
sequenceUnfoldingLayer - Sequence unfolding layer
flattenLayer - Flatten layer

Pooling and Unpooling Layers

averagePooling2dLayer - Average pooling layer
averagePooling3dLayer - 3-D average pooling layer
globalAveragePooling2dLayer - 2-D global average pooling layer
globalAveragePooling3dLayer - 3-D global average pooling layer
globalMaxPooling2dLayer - Global max pooling layer
globalMaxPooling3dLayer - 3-D global max pooling layer
maxPooling2dLayer - Max pooling layer
maxPooling3dLayer - 3-D max pooling layer
maxUnpooling2dLayer - Max unpooling layer

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer
concatenationLayer - Concatenation layer
depthConcatenationLayer - Depth concatenation layer

Output Layers

classificationLayer - Classification output layer
regressionLayer - Regression output layer
layerGraph - Graph of network layers for deep learning
plot - Plot neural network architecture
addLayers - Add layers to layer graph or network
removeLayers - Remove layers from layer graph or network
replaceLayer - Replace layer in layer graph or network
connectLayers - Connect layers in layer graph or network
disconnectLayers - Disconnect layers in layer graph or network
DAGNetwork - Directed acyclic graph (DAG) network for deep learning
resnetLayers - Create 2-D residual network
resnet3dLayers - Create 3-D residual network
isequal - Check equality of deep learning layer graphs or networks
isequaln - Check equality of deep learning layer graphs or networks, ignoring NaN values
analyzeNetwork - Analyze deep learning network architecture
resetState - Reset state parameters of neural network
dlnetwork - Deep learning network for custom training loops
addInputLayer - Add input layer to network
summary - Print network summary
initialize - Initialize learnable and state parameters of a dlnetwork
networkDataLayout - Deep learning network data layout for learnable parameter initialization
checkLayer - Check validity of custom or function layer
setL2Factor - Set L2 regularization factor of layer learnable parameter
getL2Factor - Get L2 regularization factor of layer learnable parameter
setLearnRateFactor - Set learn rate factor of layer learnable parameter
getLearnRateFactor - Get learn rate factor of layer learnable parameter
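As a sketch of how several of these functions fit together, the example below builds a small network with a skip connection using a layer graph; the layer names and sizes are illustrative:

```matlab
% Sketch: build a network with a skip connection using a layer graph.
% Layer names and sizes are illustrative assumptions.
layers = [
    imageInputLayer([32 32 3], 'Name', 'input')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv_1')
    reluLayer('Name', 'relu_1')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv_2')
    additionLayer(2, 'Name', 'add')
    reluLayer('Name', 'relu_2')];

lgraph = layerGraph(layers);
% Route the output of relu_1 to the second input of the addition
% layer, creating the residual (skip) connection.
lgraph = connectLayers(lgraph, 'relu_1', 'add/in2');

plot(lgraph)            % visualize the architecture
analyzeNetwork(lgraph)  % check for errors before training
```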

Topics

Built-In Layers

Custom Layers