scale data for NN

Hi,
How do I scale data in a neural network multilayer backpropagation?

Accepted Answer

You do not have to scale the data, because variables are AUTOMATICALLY scaled to [-1,1] with MAPMINMAX by NEWFF, FITNET, PATTERNNET and FEEDFORWARDNET.
The command
type feedforwardnet
yields these commands
========================
% Inputs
net.numInputs = 1;
net.inputConnect(1,1) = true;
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
% Outputs
net.outputConnect(Nl) = true;
net.outputs{Nl}.processFcns = {'removeconstantrows','mapminmax'};
===================================================
If you wish, you can replace either occurrence of MAPMINMAX with 'mapstd' (zero-mean/unit-variance) or 'none'.
I prefer to use MAPSTD before creating the net to try to understand the input/output relationships via plots, correlation coefficients and outliers.
Then I accept the automatic minmax normalization instead of removing or changing it.
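For reference, here is a minimal sketch of what these two process functions compute on a single variable, written in plain Python for illustration rather than MATLAB. One assumption: the standardization uses the sample standard deviation (n-1 in the denominator), matching MATLAB's default `std`.

```python
def map_min_max(x):
    """Rescale a list of numbers to [-1, 1] (the scaling mapminmax applies per row)."""
    xmin, xmax = min(x), max(x)
    return [-1 + 2 * (v - xmin) / (xmax - xmin) for v in x]

def map_std(x):
    """Rescale a list to zero mean and unit variance (the scaling mapstd applies per row)."""
    n = len(x)
    mean = sum(x) / n
    # sample standard deviation: divide by n - 1 (assumption; MATLAB's std default)
    std = (sum((v - mean) ** 2 for v in x) / (n - 1)) ** 0.5
    return [(v - mean) / std for v in x]

x = [2.0, 4.0, 6.0, 10.0]
print(map_min_max(x))  # -> [-1.0, -0.5, 0.0, 1.0]
print(map_std(x))      # zero mean, unit variance
```

The min and max of the input map exactly to -1 and +1, which is why the automatic normalization needs no extra work from you.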
Hope this helps.
Greg

2 Comments

Thanks Greg for your reply. I want to build my program on my own (I don't use the toolbox). Is there a formula for scaling? Actually, I found this formula "I = Imin + (Imax-Imin)*(X-Dmin)/(Dmax-Dmin)" by searching on Google, but I don't know its reference.
I also saw your old comment where you used the formula "xn = -1 + 2*(x-xmin)/(xmax-xmin);". Is this a formula for scaling? If yes, can I know the reference? And what did you mean by the numbers (1 & 2) in the formula?


More Answers (1)

1. Derive a linear transformation xn(x) = a.*x + b such that
xn( x = min(x) ) = -1
xn( x = max(x) ) = +1
2. Derive a linear transformation xn(x) = a.*x + b such that
mean(xn) = 0
var( xn ) = 1
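Spelled out, step 1 means solving the two equations a*min(x) + b = -1 and a*max(x) + b = +1, which gives a = 2/(xmax - xmin) and b = -(xmax + xmin)/(xmax - xmin), i.e. exactly xn = -1 + 2*(x - xmin)/(xmax - xmin). Note that the general formula quoted in the comment above, with target range Imin = -1 and Imax = +1, reduces to the same expression, which is where the constants -1 and 2 come from. Step 2 gives a = 1/std(x) and b = -mean(x)/std(x). A minimal sketch in plain Python (not MATLAB; using the sample standard deviation with n - 1 is an assumption matching MATLAB's default std):

```python
def minmax_coeffs(x):
    """Step 1: coefficients of xn = a*x + b with xn(min(x)) = -1 and xn(max(x)) = +1.
    Solving a*xmin + b = -1 and a*xmax + b = +1 simultaneously:"""
    xmin, xmax = min(x), max(x)
    a = 2 / (xmax - xmin)
    b = -(xmax + xmin) / (xmax - xmin)
    return a, b

def std_coeffs(x):
    """Step 2: coefficients of xn = a*x + b with mean(xn) = 0 and var(xn) = 1.
    Since mean(a*x + b) = a*mean(x) + b and var(a*x + b) = a^2 * var(x):"""
    n = len(x)
    mean = sum(x) / n
    std = (sum((v - mean) ** 2 for v in x) / (n - 1)) ** 0.5  # sample std (n - 1)
    return 1 / std, -mean / std

x = [3.0, 7.0, 8.0, 12.0]
a, b = minmax_coeffs(x)
print([a * v + b for v in x])  # endpoints come out as -1 and +1
```

Applying a*x + b from `minmax_coeffs` reproduces xn = -1 + 2*(x - xmin)/(xmax - xmin), and the coefficients from `std_coeffs` reproduce (x - mean)/std.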
Hope this helps.
Greg

1 Comment

Sorry Greg, it is not clear. Please, I need some more details about this.

