How to change the input size of the first layer of ResNet?

I have sequential data (a signal dataset) whose shape is 1024x1, and I want to train it on ResNet by changing the input in the first layer.
I have tried the link below, but it is not editable.

Answers (2)

Chunru
Chunru 2022-7-22
Changing the input layer may require training the network anew (not transfer learning):
nresnet = resnet50;
n = [imageInputLayer([112 112 3]); nresnet.Layers(2:end)]; % specify new size
n
n =

  177×1 Layer array with layers:

     1   ''                            Image Input                  112×112×3 images with 'zerocenter' normalization
     2   'conv1'                       Convolution                  64 7×7×3 convolutions with stride [2 2] and padding [3 3 3 3]
     3   'bn_conv1'                    Batch Normalization          Batch normalization with 64 channels
     4   'activation_1_relu'           ReLU                         ReLU
     5   'max_pooling2d_1'             Max Pooling                  3×3 max pooling with stride [2 2] and padding [1 1 1 1]
         ... (layers 6-173: the unchanged ResNet-50 residual blocks) ...
   174   'avg_pool'                    2-D Global Average Pooling   2-D global average pooling
   175   'fc1000'                      Fully Connected              1000 fully connected layer
   176   'fc1000_softmax'              Softmax                      softmax
   177   'ClassificationLayer_fc1000'  Classification Output        crossentropyex with 'tench' and 999 other classes
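Note that resnet50 is a DAG network with skip connections, so a plain layer array like n above drops the connection information. If you do go the full-retraining route, a minimal sketch (assuming the Deep Learning Toolbox Model for ResNet-50 support package is installed) would keep the layers in a layerGraph instead:
lgraph = layerGraph(resnet50);                               % keeps the skip connections
newInput = imageInputLayer([112 112 3], 'Name', 'input_1');  % new spatial size, still 3 channels
lgraph = replaceLayer(lgraph, 'input_1', newInput);
analyzeNetwork(lgraph)                                       % check that all layer sizes remain consistent
Keeping 3 channels means the pretrained 7×7×3 weights of 'conv1' still match; only the spatial size changes.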
2 Comments
Fernando Perez
Fernando Perez 2022-7-22
Thanks.
However, my code is full of bugs. I'm trying to use my own data with one of the ResNet examples, but I'm facing all sorts of issues with functions I don't understand. For instance, the countEachLabel function gives only zeros, my label images look black, and deeplabv3plusLayers still produces a wrong-size error.
Is there any example with generic data sizes and numbers of classes? The MATLAB tutorial does not take you through what to do if your data is monochrome and you only have two classes.
Chunru
Chunru 2022-7-22
If you want to use ResNet without full retraining, the best approach is to resize your input images to match the size expected by the ResNet input layer. For transfer learning, you then only need to change the final layers.
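For reference, here is a minimal sketch of that resize-and-transfer-learn route, assuming an imageDatastore named imds holding your labelled data (imds and numClasses are hypothetical names, not from the original post):
net     = resnet50;
inputSz = net.Layers(1).InputSize;                  % [224 224 3] for resnet50
augds   = augmentedImageDatastore(inputSz(1:2), imds, 'ColorPreprocessing', 'gray2rgb');  % resize (and expand grayscale) on the fly

numClasses = numel(categories(imds.Labels));
lgraph = layerGraph(net);
lgraph = replaceLayer(lgraph, 'fc1000', fullyConnectedLayer(numClasses, 'Name', 'fc_new'));
lgraph = replaceLayer(lgraph, 'ClassificationLayer_fc1000', classificationLayer('Name', 'output'));

opts = trainingOptions('adam', 'InitialLearnRate', 1e-4, 'MaxEpochs', 5);
trainedNet = trainNetwork(augds, lgraph, opts);     % only the new final layers start from scratch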



Harsh
Harsh 2024-5-3
Edited: Harsh 2024-5-3
Hi,
From what I can gather, you are trying to change the first layer of the resnet50 model in MATLAB.
"replaceLayer" method can also be used here, here's an example on how to use the "replaceLayer" function:
net = resnet50;
lgraph = layerGraph(net)
lgraph =
LayerGraph with properties:

         InputNames: {'input_1'}
        OutputNames: {'ClassificationLayer_fc1000'}
             Layers: [177x1 nnet.cnn.layer.Layer]
        Connections: [192x2 table]
firstLayer = lgraph.Layers(1);
inputLayer = imageInputLayer([28 28 1]);
lgraph = replaceLayer(lgraph,firstLayer.Name,inputLayer)
lgraph =
LayerGraph with properties:

         InputNames: {'imageinput'}
        OutputNames: {'ClassificationLayer_fc1000'}
             Layers: [177x1 nnet.cnn.layer.Layer]
        Connections: [192x2 table]
Please refer to the following link for more information on "replaceLayer":
I hope this helps, thanks!
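One caveat for the original 1024x1 signal: replacing only the input layer leaves 'conv1' with its pretrained 7x7x3 weights, which expect 3 channels. A minimal sketch for single-channel data (assuming, purely for illustration, that each 1024x1 signal is reshaped into a 32x32 image) would also replace 'conv1' with a fresh, untrained convolution:
lgraph = layerGraph(resnet50);
lgraph = replaceLayer(lgraph, 'input_1', imageInputLayer([32 32 1], 'Name', 'input_1'));
newConv = convolution2dLayer(7, 64, 'Stride', 2, 'Padding', [3 3 3 3], 'Name', 'conv1');  % random 1-channel weights
lgraph = replaceLayer(lgraph, 'conv1', newConv);

% img = reshape(signal, 32, 32);   % hypothetical per-signal preprocessing; signal is 1024x1
Because conv1 is reinitialised, this network needs retraining rather than pure transfer learning, as noted in the first answer.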
