supervised training of SOM in MATLAB
I have a labelled data set; each data entry is six-dimensional and pre-labelled as belonging to one of 10 clusters.
I would like to train a SOM to fit this labelled data set. In other words, I would like the trained SOM to assign each data entry to exactly (or almost exactly) the same cluster as its pre-assigned label.
Is there a function in the MATLAB Neural Network Toolbox that can fulfill this requirement?
Accepted Answer
Greg Heath
2011-11-21
The "SO" in SOM means "Self-Organizing" and refers to using the Kohonen algorithm for UNSUPERVISED clustering. Do not use the acronym for supervised clustering.
Supervised clustering is called classification. Good classification algorithms do not usually restrict the number of clusters per class. They tend to create additional clusters to minimize overlapping clusters of different classes.
Kohonen's algorithms for supervised clustering (i.e., classification) are LVQ1 and LVQ2; both can be found in MATLAB's Neural Network Toolbox. I think a recommended initial configuration is the one provided by SOM.
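For example, a minimal sketch of supervised LVQ training on 6-dimensional data with 10 classes, using lvqnet from the Neural Network Toolbox (the data below is a random placeholder; substitute your own labelled set):

X = rand(6, 500);                  % 6 x N matrix of feature vectors (placeholder data)
labels = randi(10, 1, 500);        % 1 x N class labels in 1..10 (placeholder data)
T = full(ind2vec(labels));         % convert labels to a one-hot target matrix
net = lvqnet(10);                  % 10 competitive (hidden) neurons, one per class here
net.trainParam.epochs = 200;       % adjust training length as needed
net = train(net, X, T);            % supervised LVQ training
predicted = vec2ind(net(X));       % predicted class index for each sample
misclassRate = mean(predicted ~= labels)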
However, if you just want to minimize the misclassification rate, do not restrict the clusters to one per class and use NEWRB. You could try to limit the number of hidden nodes to 10. However, NEWRB may create two clusters for one class before creating one for another class.
NEWRB needs to be modified to accept an initial configuration of hidden nodes (cluster centers).
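A hedged sketch of the NEWRB option, capping the hidden nodes (cluster centers) at 10 as discussed above (placeholder data; the spread value is only a starting guess you would tune):

X = rand(6, 500);                  % 6 x N feature vectors (placeholder data)
labels = randi(10, 1, 500);        % 1 x N class labels (placeholder data)
T = full(ind2vec(labels));         % one-hot targets
goal = 0;                          % MSE goal
spread = 1.0;                      % RBF spread; tune to the scale of your data
maxNeurons = 10;                   % allow at most 10 hidden nodes (cluster centers)
net = newrb(X, T, goal, spread, maxNeurons);
[~, predicted] = max(net(X), [], 1);   % take the largest output as the class
misclassRate = mean(predicted ~= labels)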
If you just want to minimize the misclassification rate and do not care about clusters, use NEWFF.
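And a sketch of the NEWFF route, a plain feedforward classifier with no explicit clusters (the hidden-layer size of 20 is an arbitrary assumption; placeholder data as before):

X = rand(6, 500);                  % placeholder feature vectors
labels = randi(10, 1, 500);        % placeholder class labels
T = full(ind2vec(labels));         % one-hot targets
net = newff(X, T, 20);             % one hidden layer of 20 neurons
net.trainParam.epochs = 300;
net = train(net, X, T);            % supervised training on the labelled data
[~, predicted] = max(net(X), [], 1);
misclassRate = mean(predicted ~= labels)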
Hope this helps.
Greg
P.S. How much 6-dimensional data do you have?
More Answers (1)
Amith Kamath
2011-11-17
I don't know if MATLAB natively includes SOM functions, but you should definitely find them in the SOM Toolbox: http://www.cis.hut.fi/somtoolbox/
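For reference, recent Neural Network Toolbox releases do ship a native (unsupervised) SOM function, selforgmap; a minimal sketch with placeholder data:

X = rand(6, 500);                  % 6 x N unlabeled feature vectors (placeholder data)
net = selforgmap([2 5]);           % 2-by-5 map = 10 map neurons
net = train(net, X);               % unsupervised Kohonen training
clusterIdx = vec2ind(net(X));      % map neuron (cluster) assigned to each sample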