Trouble using input parameters on a custom deep learning layer
I'm trying to perform feature selection on the output of a deep learning layer. My initial idea was to pass a learnable parameter into a custom layer: a vector with a '1' at every index I want to select. But when I call layer.Alpha, all of the values have changed and are no longer consistent with what I initially passed in.
classdef globalindexPooling < nnet.layer.Layer
    properties (Learnable)
        % Index-selection vector: ones at the positions to keep
        Alpha
    end
    methods
        function layer = globalindexPooling(numChannels, name)
            % Store the layer name and the selection vector
            layer.Name  = name;
            layer.Alpha = numChannels;
        end
        function Z = predict(layer, X)
            % Flatten the spatial/channel dimensions, keep only the positions
            % where Alpha is nonzero, then reshape back to 6x6x256
            input_size = [size(X,1) size(X,2) size(X,3)];
            N = size(X,4);
            X_reshaped = reshape(X, [prod(input_size) N]);
            Z = X_reshaped(logical(gather(layer.Alpha)), :);
            Z = reshape(Z, [6 6 256 N]);
        end
        function [Z, memory] = forward(layer, X)
            % Same selection as predict; no extra state is needed
            input_size = [size(X,1) size(X,2) size(X,3)];
            N = size(X,4);
            X_reshaped = reshape(X, [prod(input_size) N]);
            Z = X_reshaped(logical(gather(layer.Alpha)), :);
            Z = reshape(Z, [6 6 256 N]);
            memory = [];
        end
        function [dLdX, dLdalpha] = backward(layer, X, Z, dLdZ, ~)
            % Scatter the incoming gradient back to the selected positions;
            % the positions that were dropped get zero gradient
            input_size = [size(X,1) size(X,2) size(X,3)];
            N = size(X,4);
            dLdZ = reshape(dLdZ, [6*6*256 N]);
            dLdX_prima = zeros([prod(input_size) N], 'like', X);
            dLdX_prima(logical(gather(layer.Alpha)), :) = dLdZ;
            dLdX = reshape(dLdX_prima, [input_size N]);
            dLdalpha = ones(size(gather(layer.Alpha)), 'like', X);
        end
    end
end
The input is a logical vector with ones at the selected indexes, but after training I get values of 0.0784 and -0.9216 everywhere, and sometimes the first values are the original ones and sometimes they are not. I tried using round(), but since the mapping back to each index is inconsistent, it didn't work.
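This is the kind of check I did afterwards (a sketch; trainedNet and layerIdx are placeholder names for my trained network and the position of the custom layer in it):

% Sketch of the post-training check (trainedNet / layerIdx are placeholders).
alphaTrained = gather(trainedNet.Layers(layerIdx).Alpha);
unique(alphaTrained)                            % shows values like 0.0784 and -0.9216
recoveredMask = logical(round(alphaTrained));   % the round() attempt
nnz(recoveredMask)                              % does not line up with the indexes I selected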
0 Comments
Answers (0)