Neural net visualisation
I'm able to produce neural nets that work but I want to know what the resulting network looks like. Ideally some sort of graph with edges for each neuron showing their weights. If this does not exist, could someone give me a description of the network object output that will let me find and understand the weights myself?
Accepted Answer
More Answers (2)
Greg Heath
2011-11-4
0 votes
You can print the weights. However, it is generally impossible to interpret the weights of a regression MLP for a medium-sized real-world problem. On the other hand, the weights of a classification RBF or EBF network are more readily understood.
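For reference, printing the weights amounts to reading a few cell-array properties of the network object. A minimal sketch (assumes the older newpr interface used in this thread; IW, LW, and b are the standard network-object fields):

```matlab
% Sketch: print every weight matrix and bias of a shallow net
% created with newpr (1 hidden layer, 3 neurons).
x = [0 0; 0 1; 1 0; 1 1]';
t = [0 1 1 0];
net = newpr(x, t, 3);

disp(net.IW{1})              % input-to-hidden weights (3x2)
for i = 2:net.numLayers
    disp(net.LW{i,i-1})      % weights from layer i-1 into layer i
end
for i = 1:net.numLayers
    disp(net.b{i})           % bias vector of layer i
end
```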
Moreover, once you have all the interclass distances you can use multidimensional scaling (MDS) to get a 2-D visualization. The form of MDS that I have used is called Sammon mapping. You will have to Google to find code.
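As a sketch of that MDS step: the distance matrix below is invented for illustration, and note that mdscale from the Statistics Toolbox offers a 'sammon' criterion, so separate Sammon-mapping code may not be needed:

```matlab
% Sketch: embed hypothetical interclass distances in 2-D via Sammon mapping.
% D is an invented symmetric distance matrix for four classes.
D = [0 2 5 3;
     2 0 4 6;
     5 4 0 2;
     3 6 2 0];
Y = mdscale(D, 2, 'Criterion', 'sammon');   % 4x2 matrix of 2-D coordinates
plot(Y(:,1), Y(:,2), 'o')
text(Y(:,1), Y(:,2), {'c1','c2','c3','c4'}) % label the class points
```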
Hope this helps.
Greg
P.S. I have never used a weight graph. However, you may be able to find code on the internet.
John Dickson
2011-11-4
Thanks, I am looking into your suggestions. It may well be that medium-sized nets cannot be understood, but it would really help me to understand what's going on if you could explain a simple net. The classic XOR problem can be solved by two AND gates and an OR, i.e. (A AND ~B) OR (B AND ~A). Using the code:
input = [0 0; 0 1; 1 0; 1 1]';  % each column is an input vector
outputActual = [0 1 1 0];
net = newpr(input, outputActual, 3);  % 1 hidden layer with 3 neurons
net.divideFcn = '';  % use the entire input for training
net = init(net);  % reinitialize weights
[net, tr] = train(net, input, outputActual);  % train
outputPredicted = sim(net, input);  % predict
The solution net has input weights:
net.iw{1} =
-1.6875 2.0152
-2.3719 -1.3529
1.5600 2.8228
Layer weights:
net.lw{2} =
-2.6606 3.7609 3.7860
And biases:
net.b{1} =
2.6314
1.3409
2.0613
net.b{2} = -1.2402
This is very confusing. Do you know the simplest net that will solve this problem? Is it possible to constrain the optimisation so that the simplest solution is found?
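For what it's worth, the logic described above can be wired up by hand: one hidden unit for (A AND ~B), one for (B AND ~A), and an output unit that ORs them. With step (threshold) activations the weights become transparent. This is a hand-constructed sketch, not something train would necessarily converge to:

```matlab
% Hand-crafted 2-hidden-unit XOR net with step activations.
X  = [0 0; 0 1; 1 0; 1 1]';          % each column is an input [A; B]
W1 = [ 1 -1;                         % h1 fires for A AND ~B
      -1  1];                        % h2 fires for B AND ~A
b1 = [-0.5; -0.5];
W2 = [1 1];  b2 = -0.5;              % output fires for h1 OR h2
H = double(bsxfun(@plus, W1*X, b1) > 0);  % hidden activations
y = double(W2*H + b2 > 0)            % -> [0 1 1 0], i.e. XOR
```

So two hidden neurons suffice for XOR; whether train lands on such a clean solution depends on the random initialization, and as far as I know the only way to force a minimal architecture is to set the hidden layer size to 2 yourself.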