

Build Deep Neural Networks

Build neural networks for image data using MATLAB® code or interactively using Deep Network Designer

Create new deep networks for tasks such as image classification and regression by defining the network architecture from scratch. Build networks using MATLAB or interactively using Deep Network Designer.
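A minimal sketch of defining a network from scratch as a layer array. The 28-by-28 grayscale input size and the 10 classes are illustrative assumptions, not values from this page.

% Define a small image classification network layer by layer.
% Input size [28 28 1] and 10 classes are illustrative assumptions.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,"Padding","same")
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,"Stride",2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Inspect the resulting architecture before training.
analyzeNetwork(layers)

The same architecture can also be assembled interactively in Deep Network Designer.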

For most tasks, you can use built-in layers. If there is not a built-in layer that you need for your task, then you can define your own custom layer. You can specify a custom loss function using a custom output layer and define custom layers with learnable and state parameters. After you define a custom layer, you can check that the layer is valid, GPU compatible, and outputs correctly defined gradients. For a list of supported layers, see List of Deep Learning Layers.
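A minimal sketch of the validation step using checkLayer. The functionLayer here is a simple stand-in for a custom layer; the input size [24 24 20] and the observation (batch) dimension 4 are illustrative assumptions.

% Validate a layer with checkLayer before using it in a network.
layer = functionLayer(@(X) X.^2);
validInputSize = [24 24 20];
checkLayer(layer,validInputSize,"ObservationDimension",4)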

For models that layer graphs do not support, you can define a custom model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
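A minimal sketch of a model defined as a function, assuming a two-layer fully connected model with an illustrative parameters structure; see the linked topic for the full custom training loop workflow.

% Model defined as a function that operates on dlarray inputs.
% The parameter fields and their sizes are illustrative assumptions.
function Y = model(parameters,X)
    Y = fullyconnect(X,parameters.fc1.Weights,parameters.fc1.Bias);
    Y = relu(Y);
    Y = fullyconnect(Y,parameters.fc2.Weights,parameters.fc2.Bias);
end

Evaluate such a model on a formatted dlarray, for example Y = model(parameters,dlarray(X,"CB")), inside a custom training loop that computes gradients with dlfeval and dlgradient.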

Apps

Deep Network Designer - Design, visualize, and train deep learning networks

Functions


Input Layers

imageInputLayer - Image input layer
image3dInputLayer - 3-D image input layer

Convolution and Fully Connected Layers

convolution2dLayer - 2-D convolutional layer
convolution3dLayer - 3-D convolutional layer
groupedConvolution2dLayer - 2-D grouped convolutional layer
transposedConv2dLayer - Transposed 2-D convolution layer
transposedConv3dLayer - Transposed 3-D convolution layer
fullyConnectedLayer - Fully connected layer

Transformer Layers

selfAttentionLayer - Self-attention layer (Since R2023a)
positionEmbeddingLayer - Position embedding layer (Since R2023b)
sinusoidalPositionEncodingLayer - Sinusoidal position encoding layer (Since R2023b)
embeddingConcatenationLayer - Embedding concatenation layer (Since R2023b)
indexing1dLayer - 1-D indexing layer (Since R2023b)

Neural ODE Layers

neuralODELayer - Neural ODE layer (Since R2023b)

Activation Layers

reluLayer - Rectified linear unit (ReLU) layer
leakyReluLayer - Leaky Rectified Linear Unit (ReLU) layer
clippedReluLayer - Clipped Rectified Linear Unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer
tanhLayer - Hyperbolic tangent (tanh) layer
swishLayer - Swish layer (Since R2021a)
geluLayer - Gaussian error linear unit (GELU) layer (Since R2022b)
sigmoidLayer - Sigmoid layer (Since R2020b)
softmaxLayer - Softmax layer
functionLayer - Function layer (Since R2021b)

Normalization Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer (Since R2020b)
instanceNormalizationLayer - Instance normalization layer (Since R2021a)
layerNormalizationLayer - Layer normalization layer (Since R2021a)
crossChannelNormalizationLayer - Channel-wise local response normalization layer

Utility Layers

dropoutLayer - Dropout layer
crop2dLayer - 2-D crop layer
crop3dLayer - 3-D crop layer (Since R2019b)

Pooling and Unpooling Layers

averagePooling2dLayer - Average pooling layer
averagePooling3dLayer - 3-D average pooling layer
globalAveragePooling2dLayer - 2-D global average pooling layer (Since R2019b)
globalAveragePooling3dLayer - 3-D global average pooling layer (Since R2019b)
globalMaxPooling2dLayer - Global max pooling layer (Since R2020a)
globalMaxPooling3dLayer - 3-D global max pooling layer (Since R2020a)
maxPooling2dLayer - Max pooling layer
maxPooling3dLayer - 3-D max pooling layer
maxUnpooling2dLayer - Max unpooling layer

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer (Since R2020b)
concatenationLayer - Concatenation layer
depthConcatenationLayer - Depth concatenation layer

Output Layers

classificationLayer - Classification output layer
regressionLayer - Regression output layer
layerGraph - (Not recommended) Graph of network layers for deep learning
plot - Plot neural network architecture
addLayers - Add layers to neural network
removeLayers - Remove layers from neural network
replaceLayer - Replace layer in neural network
connectLayers - Connect layers in neural network
disconnectLayers - Disconnect layers in neural network
DAGNetwork - Directed acyclic graph (DAG) network for deep learning
resnetLayers - (Not recommended) Create 2-D residual network (Since R2021b)
resnet3dLayers - (Not recommended) Create 3-D residual network (Since R2021b)
isequal - Check equality of neural networks (Since R2021a)
isequaln - Check equality of neural networks ignoring NaN values (Since R2021a)
analyzeNetwork - Analyze deep learning network architecture
dlnetwork - Deep learning neural network (Since R2019b)
addInputLayer - Add input layer to network (Since R2022b)
summary - Print network summary (Since R2022b)
initialize - Initialize learnable and state parameters of a dlnetwork (Since R2021a)
networkDataLayout - Deep learning network data layout for learnable parameter initialization (Since R2022b)
checkLayer - Check validity of custom or function layer

Topics

Built-In Layers

Custom Layers