
Build Deep Neural Networks

Build networks for sequence and tabular data using MATLAB® code or interactively using Deep Network Designer

Create new deep networks for tasks such as classification, regression, and forecasting by defining the network architecture from scratch. Build networks using MATLAB code or interactively using Deep Network Designer.
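As a minimal sketch of building a network in code, the layer array below defines a sequence classification network. The input size, number of hidden units, and number of classes are illustrative placeholders, not values from this page.

```matlab
% Sketch: a layer array for sequence classification
% (12 input features and 9 classes are assumed sizes).
layers = [
    sequenceInputLayer(12)
    lstmLayer(100,OutputMode="last")
    fullyConnectedLayer(9)
    softmaxLayer
    classificationLayer];
```

Passing a layer array like this to a training function, or importing it into Deep Network Designer, are the two workflows this page describes.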

For most tasks, you can use built-in layers. If there is no built-in layer for your task, you can define your own custom layer. You can specify a custom loss function using a custom output layer, and you can define custom layers with learnable and state parameters. After defining a custom layer, you can check that the layer is valid, is GPU compatible, and outputs correctly defined gradients. For a list of supported layers, see List of Deep Learning Layers.
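The validity check mentioned above is performed by checkLayer. In this sketch, a functionLayer stands in for a custom layer, and the input size and activation function are illustrative assumptions.

```matlab
% Sketch: validate a layer before training with it. checkLayer runs
% validity, GPU-compatibility, and gradient tests for the given input size.
layer = functionLayer(@(X) X./(1+abs(X)));   % illustrative softsign activation
validInputSize = [24 1];                     % assumed: 24 features per observation
checkLayer(layer,validInputSize,ObservationDimension=2)
```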

For models that layer graphs do not support, you can define a custom model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
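Custom training loops operate on dlnetwork objects and dlarray data rather than on a training function. The sketch below shows only the setup and a forward pass; layer sizes and batch size are placeholders.

```matlab
% Sketch: a dlnetwork for a custom training loop (sizes are assumptions).
layers = [
    featureInputLayer(10)
    fullyConnectedLayer(50)
    reluLayer
    fullyConnectedLayer(3)
    softmaxLayer];
net = dlnetwork(layers);

% Forward pass on formatted dlarray data ("CB" = channel x batch).
X = dlarray(rand(10,8,"single"),"CB");
Y = predict(net,X);
```

Inside a training loop, you would compute gradients with dlgradient in a model loss function evaluated by dlfeval, as described in the linked topic.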

App

Deep Network Designer - Design, visualize, and train deep learning networks

Functions


Input Layers

sequenceInputLayer - Sequence input layer
featureInputLayer - Feature input layer

Recurrent Layers

lstmLayer - Long short-term memory (LSTM) layer for recurrent neural network (RNN)
bilstmLayer - Bidirectional long short-term memory (BiLSTM) layer for recurrent neural network (RNN)
gruLayer - Gated recurrent unit (GRU) layer for recurrent neural network (RNN)
lstmProjectedLayer - Long short-term memory (LSTM) projected layer for recurrent neural network (RNN)
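As a sketch of how these layers are constructed, each takes the number of hidden units as its first argument; the sizes below are illustrative, not from this page.

```matlab
% Sketch: constructing recurrent layers (all sizes are assumptions).
lstm   = lstmLayer(128);                    % outputs the full sequence
bilstm = bilstmLayer(64,OutputMode="last"); % outputs only the last time step
gru    = gruLayer(64);
proj   = lstmProjectedLayer(128,8,8);       % hidden units, output and input projector sizes
```

The projected LSTM compresses the learnable weight matrices through the two projector sizes, trading accuracy for fewer parameters.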

Convolution, Attention, and Fully Connected Layers

convolution1dLayer - 1-D convolutional layer
transposedConv1dLayer - Transposed 1-D convolution layer
selfAttentionLayer - Self-attention layer
fullyConnectedLayer - Fully connected layer
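The sketch below shows typical constructor arguments for these layers; filter widths, filter counts, and attention sizes are illustrative assumptions.

```matlab
% Sketch: convolution, attention, and fully connected layers
% (all sizes are assumptions).
conv  = convolution1dLayer(5,32,Padding="same"); % 32 filters of width 5
tconv = transposedConv1dLayer(5,32);             % upsampling counterpart
attn  = selfAttentionLayer(4,64);                % 4 heads, 64 key channels
fc    = fullyConnectedLayer(10);                 % 10 output features
```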

Activation and Dropout Layers

reluLayer - Rectified linear unit (ReLU) layer
leakyReluLayer - Leaky rectified linear unit (ReLU) layer
clippedReluLayer - Clipped rectified linear unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer
tanhLayer - Hyperbolic tangent (tanh) layer
swishLayer - Swish layer
geluLayer - Gaussian error linear unit (GELU) layer
sigmoidLayer - Sigmoid layer
softmaxLayer - Softmax layer
dropoutLayer - Dropout layer
functionLayer - Function layer
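Most of these layers take no arguments; a few accept a single scalar parameter. The values below are illustrative, not recommendations from this page.

```matlab
% Sketch: activation and dropout layers with scalar parameters
% (the values shown are assumptions).
leaky = leakyReluLayer(0.01);  % scale for negative inputs
clip  = clippedReluLayer(10);  % ceiling on the ReLU output
drop  = dropoutLayer(0.5);     % probability of dropping each input during training
```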

Normalization Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer
instanceNormalizationLayer - Instance normalization layer
layerNormalizationLayer - Layer normalization layer
crossChannelNormalizationLayer - Channel-wise local response normalization layer

Pooling Layers

maxPooling1dLayer - 1-D max pooling layer
averagePooling1dLayer - 1-D average pooling layer
globalMaxPooling1dLayer - 1-D global max pooling layer
globalAveragePooling1dLayer - 1-D global average pooling layer

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer
concatenationLayer - Concatenation layer
depthConcatenationLayer - Depth concatenation layer
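Combination layers have multiple inputs, so extra inputs must be connected explicitly by name in a layer graph. The sketch below adds a residual-style skip connection; the layer names and sizes are illustrative assumptions.

```matlab
% Sketch: merge two branches with additionLayer (names/sizes are assumptions).
lgraph = layerGraph([
    sequenceInputLayer(8,Name="in")
    convolution1dLayer(3,8,Padding="same",Name="conv")
    additionLayer(2,Name="add")]);   % the array connects in -> conv -> add/in1

% Connect the skip path to the addition layer's second input.
lgraph = connectLayers(lgraph,"in","add/in2");
```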

Data Manipulation

sequenceFoldingLayer - Sequence folding layer
sequenceUnfoldingLayer - Sequence unfolding layer
flattenLayer - Flatten layer

Output Layers

classificationLayer - Classification output layer
regressionLayer - Regression output layer
layerGraph - Graph of network layers for deep learning
plot - Plot neural network architecture
addLayers - Add layers to layer graph or network
removeLayers - Remove layers from layer graph or network
replaceLayer - Replace layer in layer graph or network
connectLayers - Connect layers in layer graph or network
disconnectLayers - Disconnect layers in layer graph or network
DAGNetwork - Directed acyclic graph (DAG) network for deep learning
isequal - Check equality of deep learning layer graphs or networks
isequaln - Check equality of deep learning layer graphs or networks, ignoring NaN values
analyzeNetwork - Analyze deep learning network architecture
dlnetwork - Deep learning network for custom training loops
addInputLayer - Add input layer to network
summary - Print network summary
initialize - Initialize learnable and state parameters of a dlnetwork
networkDataLayout - Deep learning network data layout for learnable parameter initialization
checkLayer - Check validity of custom or function layer
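These construction and analysis functions compose naturally: build a layer graph, convert it to a network, then inspect it. The sketch below uses placeholder sizes; analyzeNetwork opens an interactive report.

```matlab
% Sketch: assemble and inspect a network (all sizes are assumptions).
lgraph = layerGraph([
    featureInputLayer(16)
    fullyConnectedLayer(32)
    reluLayer
    fullyConnectedLayer(4)
    softmaxLayer]);

net = dlnetwork(lgraph);  % initializes learnable parameters
summary(net)              % print a summary of the network's learnables
analyzeNetwork(net)       % open the interactive network analyzer
```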

Topics

Built-In Layers

Custom Layers