Optimization and Machine Learning

Greg 2014-12-10
Commented: Greg 2014-12-10
If you're able to create a machine learning classification model (for example, with a boosted ensemble learning technique for regression or classification), is it then possible to optimize the predicted response around that model?
For instance, say I create a classification model from 1000 examples (rows) and 70 features (columns) to predict a binary classification response. It's simple to then manually create a hypothetical 1001st example and predict the class to which it will belong.
I would like to define and fix some of those 70 features (let's say 5) while allowing the others to vary. Is there a way to do this, and then let an optimization algorithm optimize the remaining 65 features, so that I get the combination of feature values that maximizes the likelihood of achieving a given classification?
On the surface, it seems like the Optimization Toolbox would provide this functionality, but I don't know if it's possible to define a machine learning model in the Optimization Toolbox.
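To make that concrete, here is a rough sketch of what I'm imagining (not working code: fitensemble, patternsearch, and all the index and value choices below are placeholders I made up):
% Train a boosted classification ensemble on the 1000x70 training data.
mdl = fitensemble(X, Y, 'AdaBoostM1', 100, 'Tree');

% Hold 5 features fixed and let the optimizer vary the other 65.
fixedIdx  = [1 5 12 30 64];           % hypothetical indices of the fixed features
fixedVals = [0.2 1.3 7.0 0.0 5.5];    % hypothetical values to hold them at
freeIdx   = setdiff(1:70, fixedIdx);

% Objective: negative score of the desired class (solvers minimize),
% written as a function of the free features only.
objfun = @(zFree) -classScore(mdl, zFree, freeIdx, fixedIdx, fixedVals);

% A tree ensemble's score is piecewise constant, so a derivative-free
% solver such as patternsearch or ga (Global Optimization Toolbox) seems
% a better fit than fmincon. Bound the search by the training-data range.
lb = min(X(:, freeIdx));
ub = max(X(:, freeIdx));
z0 = mean(X(:, freeIdx), 1);
zBest = patternsearch(objfun, z0, [], [], [], [], lb, ub);

% Helper (separate file): assemble the full 1x70 feature vector and
% return the score of the target class.
function s = classScore(mdl, zFree, freeIdx, fixedIdx, fixedVals)
    z = zeros(1, 70);
    z(fixedIdx) = fixedVals;
    z(freeIdx)  = zFree;
    [~, score]  = predict(mdl, z);
    s = score(2);                 % column order follows mdl.ClassNames
end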
Thanks.

Answers (1)

Sean de Wolski 2014-12-10
Sequential feature selection is what it sounds like you're looking for.
doc sequentialfs
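For example, something along these lines (a minimal sketch, assuming numeric class labels and the usual misclassification-count criterion):
% Criterion handed to sequentialfs: number of misclassified test
% observations for a candidate feature subset.
critfun = @(XT, yT, Xt, yt) sum(yt ~= classify(Xt, XT, yT));

% Greedily add features, scoring each candidate subset with
% 10-fold cross-validation.
inmodel  = sequentialfs(critfun, X, Y, 'cv', 10);
selected = find(inmodel);   % column indices of the selected features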
  1 Comment
Greg 2014-12-10
I don't think so. Sequential feature selection appears to be a way to minimize the number of variables required to achieve a model's optimal predictive capability. I'm looking for a way to substitute feature values back into an existing model and generate the optimal combination of feature values that yields a target response.
