Built-In Layers
For most tasks, you can use built-in layers. If there is no built-in layer that you need for your task, then you can define your own custom layer. You can define custom layers with learnable and state parameters. After you define a custom layer, you can check that the layer is valid, GPU compatible, and outputs correctly defined gradients. For a list of supported layers, see List of Deep Learning Layers.
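As a starting point, built-in layers can be stacked in a plain layer array and assembled into a network. The following is a minimal sketch (requires Deep Learning Toolbox; the input size and number of classes are illustrative assumptions):

```matlab
% Assemble a small image classification network from built-in layers.
layers = [
    imageInputLayer([28 28 1])                    % 28x28 grayscale input
    convolution2dLayer(3,16,Padding="same")       % 16 3x3 filters
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,Stride=2)
    fullyConnectedLayer(10)                       % 10 classes
    softmaxLayer];
net = dlnetwork(layers);   % assemble and auto-initialize learnables
analyzeNetwork(net)        % inspect the architecture for errors
```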
Apps
Deep Network Designer | Design and visualize deep learning networks |
Functions
Input Layers
inputLayer | Input layer (Since R2023b) |
imageInputLayer | Image input layer |
image3dInputLayer | 3-D image input layer |
sequenceInputLayer | Sequence input layer |
featureInputLayer | Feature input layer |
Convolution and Fully Connected Layers
convolution1dLayer | 1-D convolutional layer (Since R2021b) |
convolution2dLayer | 2-D convolutional layer |
convolution3dLayer | 3-D convolutional layer |
groupedConvolution2dLayer | 2-D grouped convolutional layer |
transposedConv1dLayer | Transposed 1-D convolution layer (Since R2022a) |
transposedConv2dLayer | Transposed 2-D convolution layer |
transposedConv3dLayer | Transposed 3-D convolution layer |
fullyConnectedLayer | Fully connected layer |
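For sequence data, the 1-D variants follow the same pattern as their 2-D counterparts. A brief sketch (the channel and class counts are assumptions for illustration):

```matlab
% 1-D convolution over sequence data with 12 channels, 4 output classes.
layers = [
    sequenceInputLayer(12)
    convolution1dLayer(5,32,Padding="causal")  % causal padding for time series
    reluLayer
    globalAveragePooling1dLayer                % pool over the time dimension
    fullyConnectedLayer(4)
    softmaxLayer];
```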
Recurrent Layers
lstmLayer | Long short-term memory (LSTM) layer for recurrent neural network (RNN) |
bilstmLayer | Bidirectional long short-term memory (BiLSTM) layer for recurrent neural network (RNN) |
gruLayer | Gated recurrent unit (GRU) layer for recurrent neural network (RNN) |
lstmProjectedLayer | Long short-term memory (LSTM) projected layer for recurrent neural network (RNN) (Since R2022b) |
gruProjectedLayer | Gated recurrent unit (GRU) projected layer for recurrent neural network (RNN) (Since R2023b) |
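A typical use of these layers is sequence-to-label classification. A minimal sketch (feature count, hidden units, and class count are assumptions):

```matlab
% Sequence classification with an LSTM: 8 input features, 5 classes.
layers = [
    sequenceInputLayer(8)
    lstmLayer(100,OutputMode="last")   % keep only the final time step
    fullyConnectedLayer(5)
    softmaxLayer];
net = dlnetwork(layers);
```

Setting `OutputMode="sequence"` instead keeps the full output sequence, which is the usual choice for sequence-to-sequence tasks.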
Transformer Layers
selfAttentionLayer | Self-attention layer (Since R2023a) |
attentionLayer | Dot-product attention layer (Since R2024a) |
positionEmbeddingLayer | Position embedding layer (Since R2023b) |
sinusoidalPositionEncodingLayer | Sinusoidal position encoding layer (Since R2023b) |
embeddingConcatenationLayer | Embedding concatenation layer (Since R2023b) |
indexing1dLayer | 1-D indexing layer (Since R2023b) |
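These layers combine into transformer-style blocks. A simplified sketch of a single encoder-like block (embedding dimension and head count are assumptions; a full transformer would also add position information and residual connections):

```matlab
% Minimal attention block: 64-channel sequences, 4 attention heads.
layers = [
    sequenceInputLayer(64)
    selfAttentionLayer(4,64)     % 4 heads, 64 key/query channels
    layerNormalizationLayer
    fullyConnectedLayer(64)
    reluLayer];
```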
Neural ODE Layers
neuralODELayer | Neural ODE layer (Since R2023b) |
deep.ode.options.ODE1 | Neural ODE solver options for nonstiff differential equations using Euler method (Since R2025a) |
deep.ode.options.ODE45 | Neural ODE solver options for nonstiff differential equations (Since R2025a) |
Activation Layers
reluLayer | Rectified linear unit (ReLU) layer |
leakyReluLayer | Leaky Rectified Linear Unit (ReLU) layer |
preluLayer | Parametrized Rectified Linear Unit (PReLU) layer (Since R2024a) |
clippedReluLayer | Clipped Rectified Linear Unit (ReLU) layer |
eluLayer | Exponential linear unit (ELU) layer |
tanhLayer | Hyperbolic tangent (tanh) layer |
swishLayer | Swish layer (Since R2021a) |
geluLayer | Gaussian error linear unit (GELU) layer (Since R2022b) |
softmaxLayer | Softmax layer |
sigmoidLayer | Sigmoid layer |
softplusLayer | Softplus layer |
complexReluLayer | Complex rectified linear unit (ReLU) layer (Since R2025a) |
functionLayer | Function layer (Since R2021b) |
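When no built-in activation fits, `functionLayer` can wrap a simple element-wise function as a layer without writing a custom layer class. A sketch (the scaled tanh here is a hypothetical example, not a recommended activation):

```matlab
% Wrap an element-wise function as a layer.
layer = functionLayer(@(X) 2*tanh(X), Description="scaled tanh");
```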
Normalization Layers
batchNormalizationLayer | Batch normalization layer |
groupNormalizationLayer | Group normalization layer |
instanceNormalizationLayer | Instance normalization layer (Since R2021a) |
layerNormalizationLayer | Layer normalization layer (Since R2021a) |
crossChannelNormalizationLayer | Channel-wise local response normalization layer |
Utility Layers
dropoutLayer | Dropout layer |
spatialDropoutLayer | Spatial dropout layer (Since R2024a) |
flattenLayer | Flatten layer |
crop2dLayer | 2-D crop layer |
crop3dLayer | 3-D crop layer |
scalingLayer | Scaling layer |
quadraticLayer | Quadratic layer |
identityLayer | Identity layer (Since R2024b) |
complexToRealLayer | Complex-to-real layer (Since R2024b) |
realToComplexLayer | Real-to-complex layer (Since R2024b) |
networkLayer | Network layer (Since R2024a) |
reshapeLayer | Reshape layer (Since R2025a) |
permuteLayer | Permute layer (Since R2025a) |
Pooling and Unpooling Layers
averagePooling1dLayer | 1-D average pooling layer (Since R2021b) |
averagePooling2dLayer | Average pooling layer |
averagePooling3dLayer | 3-D average pooling layer |
adaptiveAveragePooling2dLayer | Adaptive average pooling 2-D layer (Since R2024a) |
globalAveragePooling1dLayer | 1-D global average pooling layer (Since R2021b) |
globalAveragePooling2dLayer | 2-D global average pooling layer |
globalAveragePooling3dLayer | 3-D global average pooling layer |
globalMaxPooling1dLayer | 1-D global max pooling layer (Since R2021b) |
globalMaxPooling2dLayer | Global max pooling layer |
globalMaxPooling3dLayer | 3-D global max pooling layer |
maxPooling1dLayer | 1-D max pooling layer (Since R2021b) |
maxPooling2dLayer | Max pooling layer |
maxPooling3dLayer | 3-D max pooling layer |
maxUnpooling2dLayer | Max unpooling layer |
Combining Layers
dlnetwork | Deep learning neural network |
imagePretrainedNetwork | Pretrained neural network for images (Since R2024a) |
resnetNetwork | 2-D residual neural network (Since R2024a) |
resnet3dNetwork | 3-D residual neural network (Since R2024a) |
dag2dlnetwork | Convert SeriesNetwork and DAGNetwork to dlnetwork (Since R2024a) |
addLayers | Add layers to neural network |
removeLayers | Remove layers from neural network |
replaceLayer | Replace layer in neural network |
getLayer | Look up a layer by name or path (Since R2024a) |
connectLayers | Connect layers in neural network |
disconnectLayers | Disconnect layers in neural network |
expandLayers | Expand network layers (Since R2024a) |
groupLayers | Group layers into network layers (Since R2024a) |
addInputLayer | Add input layer to network (Since R2022b) |
initialize | Initialize learnable and state parameters of neural network (Since R2021a) |
networkDataLayout | Deep learning network data layout for learnable parameter initialization (Since R2022b) |
setL2Factor | Set L2 regularization factor of layer learnable parameter |
getL2Factor | Get L2 regularization factor of layer learnable parameter |
setLearnRateFactor | Set learn rate factor of layer learnable parameter |
getLearnRateFactor | Get learn rate factor of layer learnable parameter |
plot | Plot neural network architecture |
summary | Print network summary (Since R2022b) |
analyzeNetwork | Analyze deep learning network architecture |
checkLayer | Check validity of custom or function layer |
isequal | Check equality of neural networks (Since R2021a) |
isequaln | Check equality of neural networks ignoring NaN values (Since R2021a) |
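As the introduction notes, a layer can be checked for validity before use. A sketch of validating a function layer with `checkLayer` (the input size and observation dimension are illustrative assumptions):

```matlab
% Validate a custom function layer before adding it to a network.
layer = functionLayer(@(X) 2*tanh(X), Description="scaled tanh");
validInputSize = [28 28 3];                       % one observation's size
checkLayer(layer,validInputSize,ObservationDimension=4)
```

The checks include GPU compatibility and correctly defined gradients where applicable; failures are reported as diagnostics in the Command Window.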
Topics
- Long Short-Term Memory Neural Networks
Learn about long short-term memory (LSTM) neural networks.
- Create Simple Deep Learning Neural Network for Classification
This example shows how to create and train a simple convolutional neural network for deep learning classification.
- Train Convolutional Neural Network for Regression
This example shows how to train a convolutional neural network to predict the angles of rotation of handwritten digits.
- List of Deep Learning Layers
Discover all the deep learning layers in MATLAB®.
- Build Networks with Deep Network Designer
Interactively build and edit deep learning networks in Deep Network Designer.
- Create and Train Network with Nested Layers
This example shows how to create and train a network with nested layers using network layers. (Since R2024a)
- Example Deep Learning Networks Architectures
This example shows how to define simple deep learning neural networks for classification and regression tasks.
- Choose an AI Model
Explore options for choosing an AI model.
- Generate MATLAB Code from Deep Network Designer
Generate MATLAB code to recreate a network designed in Deep Network Designer.
- Multiple-Input and Multiple-Output Networks
Learn how to define and train deep learning networks with multiple inputs or multiple outputs.
Featured Examples
Build Image-to-Image Regression Network Using Deep Network Designer
Use Deep Network Designer to construct an image-to-image regression network for super resolution.
Multilabel Image Classification Using Deep Learning
Use transfer learning to train a deep learning model for multilabel image classification.
Train Residual Network for Image Classification
This example shows how to create a deep learning neural network with residual connections and train it on CIFAR-10 data. Residual connections are a common element in convolutional neural network architectures. Using residual connections improves gradient flow through the network, which enables training of deeper networks.
Train Network on Image and Feature Data
Train a network that classifies handwritten digits using both image and feature input data.
Interpretable Time Series Forecasting Using a Temporal Fusion Transformer
Forecast electricity usage using an interpretable temporal fusion transformer (TFT).