
Custom Training Loops

Customize deep learning training loops and loss functions for image networks

If the trainingOptions function does not provide the training options that you need for your task, or if custom output layers do not support the loss function that you need, then you can define a custom training loop. For networks that cannot be created using layer graphs, you can define a custom network as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
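A custom training loop typically iterates over epochs and mini-batches, evaluates the model loss and gradients with dlfeval, and updates the learnable parameters with an optimizer update function such as adamupdate. The following minimal sketch assumes a dlnetwork object `net`, a minibatchqueue object `mbq`, and a user-defined `modelLoss` function already exist; these names are illustrative, not part of any built-in API.

```matlab
% Minimal custom training loop sketch (assumes net, mbq, and
% modelLoss are already defined elsewhere).
numEpochs = 10;
learnRate = 0.01;
averageGrad = [];
averageSqGrad = [];
iteration = 0;

for epoch = 1:numEpochs
    shuffle(mbq);
    while hasdata(mbq)
        iteration = iteration + 1;
        [X,T] = next(mbq);

        % Evaluate the model loss and gradients inside dlfeval,
        % which enables automatic differentiation.
        [loss,gradients] = dlfeval(@modelLoss,net,X,T);

        % Update the learnable parameters using the Adam optimizer.
        [net,averageGrad,averageSqGrad] = adamupdate(net,gradients, ...
            averageGrad,averageSqGrad,iteration,learnRate);
    end
end
```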

Functions

dlnetwork - Deep learning neural network (Since R2019b)
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops (Since R2022b)
minibatchqueue - Create mini-batches for deep learning (Since R2020b)
dlarray - Deep learning array for customization (Since R2019b)
dlgradient - Compute gradients for custom training loops using automatic differentiation (Since R2019b)
dlfeval - Evaluate deep learning model for custom training loops (Since R2019b)
crossentropy - Cross-entropy loss for classification tasks (Since R2019b)
l1loss - L1 loss for regression tasks (Since R2021b)
l2loss - L2 loss for regression tasks (Since R2021b)
huber - Huber loss for regression tasks (Since R2021a)
mse - Half mean squared error (Since R2019b)
dlconv - Deep learning convolution (Since R2019b)
dltranspconv - Deep learning transposed convolution (Since R2019b)
fullyconnect - Sum all weighted input data and apply a bias (Since R2019b)
batchnorm - Normalize data across all observations for each channel independently (Since R2019b)
crosschannelnorm - Cross channel square-normalize using local responses (Since R2020a)
groupnorm - Normalize data across grouped subsets of channels for each observation independently (Since R2020b)
instancenorm - Normalize across each channel for each observation independently (Since R2021a)
layernorm - Normalize data across all channels for each observation independently (Since R2021a)
avgpool - Pool data to average values over spatial dimensions (Since R2019b)
maxpool - Pool data to maximum value (Since R2019b)
maxunpool - Unpool the output of a maximum pooling operation (Since R2019b)
relu - Apply rectified linear unit activation (Since R2019b)
leakyrelu - Apply leaky rectified linear unit activation (Since R2019b)
gelu - Apply Gaussian error linear unit (GELU) activation (Since R2022b)
softmax - Apply softmax activation to channel dimension (Since R2019b)
sigmoid - Apply sigmoid activation (Since R2019b)
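Several of the functions above are typically combined inside a model loss function: the forward pass, a loss such as crossentropy, and dlgradient to compute gradients with respect to the learnable parameters. The sketch below shows one such function for classification; the function name `modelLoss` is an illustrative convention, and dlgradient must be called inside a function that is evaluated with dlfeval.

```matlab
function [loss,gradients] = modelLoss(net,X,T)
    % Forward pass through the dlnetwork.
    Y = forward(net,X);

    % Apply softmax and compute the classification loss;
    % crossentropy expects probabilities and one-hot targets.
    Y = softmax(Y);
    loss = crossentropy(Y,T);

    % Gradients of the loss with respect to the learnable parameters,
    % computed by automatic differentiation.
    gradients = dlgradient(loss,net.Learnables);
end
```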

Topics

Custom Training Loops

Automatic Differentiation