coding structure of gaussian noise layer

I want to create a custom layer that adds some special noise to the data.
But my MATLAB version is R2017b, so I don't have the example file gaussianNoiseLayer.m.
In R2018a or R2018b, that file is located at fullfile(matlabroot, 'examples', 'nnet', 'main', 'gaussianNoiseLayer.m').
I really want to know the coding structure of such a noise-adding layer.
If any kind-hearted person has installed a newer version of MATLAB, could you send a copy of this file to me?
Email: xjy1236@sina.com. Thank you very much!
  1 Comment
MAHSA YOUSEFI, 2021-1-4
Hi Jian.
Did you solve your problem with adding noise?
I want to add Gaussian noise to each hidden layer and to the input in my customized training loop.
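One possible way to do this in a dlarray-based custom training loop (R2019b or later) is sketched below; the helper name addGaussianNoise and the single-precision noise are assumptions, not something from this thread.

function dlX = addGaussianNoise(dlX, sigma)
    % addGaussianNoise  Sketch of adding zero-mean Gaussian noise to an
    % activation inside a custom training loop (assumes dlX is a dlarray
    % or numeric array). Call this only during training, not at inference.
    noise = sigma * randn(size(dlX), 'single');  % noise with standard deviation sigma
    dlX = dlX + noise;                           % noisy activation passed to the next layer
end

For example, calling dlX = addGaussianNoise(dlX, 0.1); between layer calls in the model function would inject noise into that hidden activation during training.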


Answers (1)

Jack Xiao, 2021-2-22
Here is the code:
classdef gaussianNoiseLayer < nnet.layer.Layer
    % gaussianNoiseLayer   Gaussian noise layer
    %   A Gaussian noise layer adds random Gaussian noise to the input.
    %
    %   To create a Gaussian noise layer, use
    %       layer = gaussianNoiseLayer(sigma, name)

    properties
        % Standard deviation.
        Sigma
    end

    methods
        function layer = gaussianNoiseLayer(sigma, name)
            % layer = gaussianNoiseLayer(sigma, name) creates a Gaussian
            % noise layer and specifies the standard deviation and layer
            % name.
            layer.Name = name;
            layer.Description = ...
                "Gaussian noise with standard deviation " + sigma;
            layer.Type = "Gaussian Noise";
            layer.Sigma = sigma;
        end

        function Z = predict(layer, X)
            % Z = predict(layer, X) forwards the input data X through the
            % layer for prediction and outputs the result Z.
            % At prediction time, the output is equal to the input.
            Z = X;
        end

        function [Z, memory] = forward(layer, X)
            % Z = forward(layer, X) forwards the input data X through the
            % layer and outputs the result Z.
            % At training time, the layer adds Gaussian noise to the input.
            sigma = layer.Sigma;
            noise = randn(size(X)) * sigma;
            Z = X + noise;
            memory = [];
        end

        function dLdX = backward(layer, X, Z, dLdZ, memory)
            % dLdX = backward(layer, X, Z, dLdZ, memory) backward
            % propagates the derivative of the loss function through
            % the layer.
            % Since the layer adds random noise that does not depend on X,
            % the derivative dLdX is equal to dLdZ.
            dLdX = dLdZ;
        end
    end
end
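
A minimal usage sketch, assuming the class above is saved as gaussianNoiseLayer.m on the MATLAB path; the input size, surrounding layers, sigma = 0.1, layer name 'noise', and training options are illustrative assumptions, not part of the original answer.

% Minimal usage sketch (assumptions: 28x28 grayscale input, 10 classes,
% sigma = 0.1, and gaussianNoiseLayer.m saved on the MATLAB path).
layers = [
    imageInputLayer([28 28 1])
    gaussianNoiseLayer(0.1, 'noise')             % noise is added only during training
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', 'MaxEpochs', 5);
% net = trainNetwork(XTrain, YTrain, layers, options);  % XTrain/YTrain: your own data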
