
Deep Network Customization for Images

Customize deep learning training loops and loss functions

If the trainingOptions function does not provide the training options you need for your task, or if custom output layers do not support the loss function you need, then you can define a custom training loop. For networks that cannot be created using layer graphs, you can define a custom network as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
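
The sketch below is a minimal custom training loop of this kind. It assumes dsTrain (a datastore of image/label pairs), classNames, and layers (a layer array without an output layer, with input normalization already handled) are defined elsewhere, and it combines dlnetwork, minibatchqueue, onehotencode, dlfeval, dlgradient, and adamupdate from the function list that follows.

    % Minimal custom training loop sketch. dsTrain, classNames, and layers
    % are placeholders for your own data and architecture.
    net = dlnetwork(layers);                       % network for custom training

    mbq = minibatchqueue(dsTrain, ...
        "MiniBatchSize",32, ...
        "MiniBatchFcn",@preprocessMiniBatch, ...
        "MiniBatchFormat",["SSCB" "CB"]);          % deliver batches as formatted dlarray

    avgGrad = []; avgSqGrad = []; iteration = 0;
    for epoch = 1:10
        shuffle(mbq);
        while hasdata(mbq)
            iteration = iteration + 1;
            [X,T] = next(mbq);

            % Evaluate loss and gradients using automatic differentiation.
            [loss,gradients] = dlfeval(@modelLoss,net,X,T);

            % Update the learnable parameters with the Adam optimizer.
            [net,avgGrad,avgSqGrad] = adamupdate(net,gradients, ...
                avgGrad,avgSqGrad,iteration);
        end
    end

    function [loss,gradients] = modelLoss(net,X,T)
        Y = forward(net,X);                        % network output in training mode
        loss = crossentropy(Y,T);                  % cross-entropy classification loss
        gradients = dlgradient(loss,net.Learnables);
    end

    function [X,T] = preprocessMiniBatch(dataX,dataT)
        X = cat(4,dataX{:});                       % stack images along the batch dimension
        T = onehotencode(cat(2,dataT{:}),1);       % one-hot encode the class labels
    end

Swapping adamupdate for sgdmupdate or rmspropupdate only changes the update call and its state variables; the rest of the loop stays the same.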

Functions


dlnetwork - Deep learning network for custom training loops
forward - Compute deep learning network output for training
predict - Compute deep learning network output for inference
adamupdate - Update parameters using adaptive moment estimation (Adam)
rmspropupdate - Update parameters using root mean squared propagation (RMSProp)
sgdmupdate - Update parameters using stochastic gradient descent with momentum (SGDM)
dlupdate - Update parameters using custom function
minibatchqueue - Create mini-batches for deep learning
onehotencode - Encode data labels into one-hot vectors
onehotdecode - Decode probability vectors into class labels
initialize - Initialize learnable and state parameters of a dlnetwork
plot - Plot neural network architecture
addLayers - Add layers to layer graph or network
removeLayers - Remove layers from layer graph or network
connectLayers - Connect layers in layer graph or network
disconnectLayers - Disconnect layers in layer graph or network
replaceLayer - Replace layer in layer graph or network
summary - Print network summary
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops
dlarray - Deep learning array for custom training loops
dlgradient - Compute gradients for custom training loops using automatic differentiation
dlfeval - Evaluate deep learning model for custom training loops
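
For the inference side of the same workflow, a short sketch using predict and onehotdecode; XTest and classNames are placeholders for your own test image and class list.

    X = dlarray(single(XTest),"SSCB");         % single image formatted as spatial-spatial-channel-batch
    scores = predict(net,X);                   % inference-mode forward pass
    label = onehotdecode(scores,classNames,1); % pick the class with the highest score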

Input Layers

imageInputLayer - Image input layer
image3dInputLayer - 3-D image input layer
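
For example, assuming 224-by-224 RGB images and 64-by-64-by-64 single-channel volumes, the input layers are declared with the input size:

    layer2d = imageInputLayer([224 224 3]);       % height, width, channels
    layer3d = image3dInputLayer([64 64 64 1]);    % height, width, depth, channels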

Convolution and Fully Connected Layers

convolution2dLayer - 2-D convolutional layer
convolution3dLayer - 3-D convolutional layer
groupedConvolution2dLayer - 2-D grouped convolutional layer
transposedConv2dLayer - Transposed 2-D convolution layer
transposedConv3dLayer - Transposed 3-D convolution layer
fullyConnectedLayer - Fully connected layer
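
A minimal sketch assembling these constructors into a small classification backbone for a dlnetwork; layer sizes are illustrative, and input normalization is disabled here for simplicity because dlnetwork does not precompute normalization statistics.

    layers = [
        imageInputLayer([28 28 1],"Normalization","none")
        convolution2dLayer(3,16,"Padding","same")   % 3-by-3 convolution, 16 filters
        reluLayer
        maxPooling2dLayer(2,"Stride",2)
        convolution2dLayer(3,32,"Padding","same")
        reluLayer
        fullyConnectedLayer(10)                     % one output per class
        softmaxLayer];
    net = dlnetwork(layers);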

Activation Layers

reluLayer - Rectified linear unit (ReLU) layer
leakyReluLayer - Leaky Rectified Linear Unit (ReLU) layer
clippedReluLayer - Clipped Rectified Linear Unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer
tanhLayer - Hyperbolic tangent (tanh) layer
swishLayer - Swish layer
geluLayer - Gaussian error linear unit (GELU) layer
functionLayer - Function layer
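
Most activation layers take at most one constructor argument; functionLayer additionally lets you wrap an arbitrary element-wise operation as a layer. A small sketch (the mish activation here is only an illustration):

    lrelu = leakyReluLayer(0.1);                             % leaky ReLU with negative-slope scale 0.1
    mish  = functionLayer(@(X) X.*tanh(log(1+exp(X))), ...   % custom activation wrapped in a function layer
        "Name","mish");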

Normalization, Dropout, and Cropping Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer
instanceNormalizationLayer - Instance normalization layer
layerNormalizationLayer - Layer normalization layer
crossChannelNormalizationLayer - Channel-wise local response normalization layer
dropoutLayer - Dropout layer
crop2dLayer - 2-D crop layer
crop3dLayer - 3-D crop layer
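
A common pattern is to place a normalization layer between each convolution and its activation, with dropout used for regularization further down the network; a sketch with assumed sizes:

    block = [
        convolution2dLayer(3,32,"Padding","same")
        batchNormalizationLayer          % normalize activations over each mini-batch
        reluLayer
        dropoutLayer(0.2)];              % randomly drop 20% of activations during training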

Pooling and Unpooling Layers

averagePooling2dLayer - Average pooling layer
averagePooling3dLayer - 3-D average pooling layer
globalAveragePooling2dLayer - 2-D global average pooling layer
globalAveragePooling3dLayer - 3-D global average pooling layer
globalMaxPooling2dLayer - Global max pooling layer
globalMaxPooling3dLayer - 3-D global max pooling layer
maxPooling2dLayer - Max pooling layer
maxPooling3dLayer - 3-D max pooling layer
maxUnpooling2dLayer - Max unpooling layer
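
Pooling layers typically shrink the spatial dimensions stage by stage, with a global pooling layer collapsing each feature map before the final fully connected layer; for example (sizes are illustrative):

    tail = [
        maxPooling2dLayer(2,"Stride",2)      % halve the spatial dimensions
        convolution2dLayer(3,64,"Padding","same")
        reluLayer
        globalAveragePooling2dLayer          % collapse each feature map to a single value
        fullyConnectedLayer(numClasses)];    % numClasses is a placeholder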

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer
concatenationLayer - Concatenation layer
depthConcatenationLayer - Depth concatenation layer
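
Because combination layers take more than one input, they are usually assembled into a layer graph and wired up with connectLayers; a residual-connection sketch with illustrative names and sizes:

    lgraph = layerGraph([
        imageInputLayer([32 32 3],"Normalization","none","Name","in")
        convolution2dLayer(3,16,"Padding","same","Name","conv1")
        reluLayer("Name","relu1")
        convolution2dLayer(3,16,"Padding","same","Name","conv2")
        additionLayer(2,"Name","add")        % sums its two inputs element-wise
        reluLayer("Name","relu2")]);

    % Skip connection: route the output of relu1 to the second input of add.
    lgraph = connectLayers(lgraph,"relu1","add/in2");
    net = dlnetwork(lgraph);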

Topics