Optimal (linear) combination of several binary probabilistic classifiers
Hello everyone,
Based on an EEG training set, I have trained several binary probabilistic classifiers (logistic regression) that each try to predict, on an independent EEG test set, whether a person is currently thinking about a movement (class 1) or not (class 2). All classifiers work above chance level, but they are all far from perfect. I'm now wondering whether it is somehow possible to optimally combine my classifiers' outputs in order to obtain a single probability value per trial that is more reliable than any individual classifier output. My intuition is that the solution has something to do with "ensemble methods" (https://en.wikipedia.org/wiki/Ensemble_learning); unfortunately, I am a bit overwhelmed by all the methods and algorithms available (and my knowledge of machine learning is quite limited).
Could anyone advise me on which method would be most suitable for my purpose (an optimal linear combination of different binary probabilistic classifiers) and how I could easily implement it? I'm wondering whether a simple linear regression model would already do the job, although that would probably not be the best solution, right?
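For context on what I mean: one common approach I have read about is "stacking" — hold out part of the training data, collect each base classifier's class-1 probability per trial, and fit a second-level logistic regression on those probabilities so that the combination stays a valid probability. The snippet below is only a minimal sketch of that idea in Python with simulated data standing in for real EEG classifier outputs (all numbers and names are hypothetical); the same thing should be straightforward in MATLAB, e.g. with glmfit and a binomial link.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical stand-in data: class-1 probabilities from 3 base classifiers ---
# In practice these would be the per-trial outputs of the trained logistic
# regressions on a held-out validation set, with y the true labels there.
n_trials, n_clf = 200, 3
y = rng.integers(0, 2, n_trials)                      # 1 = movement, 0 = rest
P = np.clip(0.5 + 0.25 * (2 * y[:, None] - 1)         # informative signal
            + 0.2 * rng.standard_normal((n_trials, n_clf)),  # classifier noise
            0.01, 0.99)

# --- Fit stacking weights: logistic regression on the base outputs ---
X = np.column_stack([np.ones(n_trials), P])           # intercept + 3 features
w = np.zeros(X.shape[1])
for _ in range(2000):                                 # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-X @ w))                  # combined probability
    w -= 0.1 * X.T @ (p - y) / n_trials               # log-loss gradient step

combined = 1.0 / (1.0 + np.exp(-X @ w))               # one probability per trial
acc_single = np.mean((P[:, 0] > 0.5) == y)            # a single base classifier
acc_combined = np.mean((combined > 0.5) == y)         # the stacked combination
print(f"single: {acc_single:.2f}  combined: {acc_combined:.2f}")
```

With roughly independent classifier errors, the combined accuracy should exceed any single classifier's; the key practical point is to fit the combining weights on data the base classifiers were not trained on, otherwise the stacker just learns to trust an overfit model.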
I would be very thankful for any help. Cheers
0 Comments
Answers (0)