Neural Network Normalization process
Hello all,
I have a question regarding the NN normalization procedure. When an NN is trained using the train(net,x,y) command, the function somehow normalizes x and y so that they fall within [-1,1] before the weights and biases are fit.
Currently I am in the process of trying to apply a set of NN weights and biases analytically (instead of just calling net(xtest)) using the following equation:
ytest = Outputbias+Hiddenweight*tanh(Inputbias + Inputweight*xtest);
This produces an output; however, it does not match ytest = net(xtest).
I'm assuming the difference is because xtest is not normalized before being used in the above equation.
I tried simply dividing xtest by its maximum before feeding it into the equation, but the results still differ.
Does anyone know how xtest should be manipulated in order to produce the same output as net(xtest)?
Thanks! Bryan
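By default, train does not divide by the maximum: it applies mapminmax, which maps each row of x and y linearly onto [-1,1] and stores the mapping settings on the network object, so the same transform (and its inverse for the output) has to be reproduced analytically. A minimal sketch, assuming a default one-hidden-layer feedforwardnet with the default processFcns ({'removeconstantrows','mapminmax'}); variable names like xtest follow the question:

```matlab
IW = net.IW{1,1};   % input-to-hidden weights (Inputweight in the question)
LW = net.LW{2,1};   % hidden-to-output weights (Hiddenweight)
b1 = net.b{1};      % hidden-layer bias (Inputbias)
b2 = net.b{2};      % output-layer bias (Outputbias)

% Locate the stored mapminmax settings; their position in processSettings
% depends on the order of processFcns, so look it up rather than hard-code it.
ki = find(strcmp(net.inputs{1}.processFcns,    'mapminmax'));
ko = find(strcmp(net.outputs{end}.processFcns, 'mapminmax'));
inSettings  = net.inputs{1}.processSettings{ki};
outSettings = net.outputs{end}.processSettings{ko};

xn = mapminmax('apply', xtest, inSettings);    % map xtest into [-1,1]
yn = b2 + LW*tanh(b1 + IW*xn);                 % forward pass in normalized space
ytest = mapminmax('reverse', yn, outSettings); % un-map yn back to the original scale
```

Note that if removeconstantrows actually dropped any rows during training, those rows would also need to be removed from xtest before applying the weights; for typical data this step is a no-op.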