How do Bayesian optimization and cross-validation work together?
Hello,
I was wondering how exactly the hyperparameter optimization works in this example: Example. The default setting is 5-fold cross-validation, but the output is a normal RegressionSVM and not a RegressionPartitionedSVM. Here is how I understand the process; please give me feedback.
Let's consider the first step of the hyperparameter optimization. The algorithm chooses an initial hyperparameter setting and trains a model on 4/5 of the data, then evaluates its performance on the remaining 1/5. What happens next? Is the same hyperparameter setting used again to train a model on a different 4/5 of the data? After 5 such iterations you have 5 objective-function values, which are averaged into a single loss? That loss is the final loss for the first hyperparameter setting, and the whole procedure is then repeated for 30 hyperparameter settings?
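The loop described above can be sketched in code. This is a minimal illustration, not MATLAB's actual implementation: it uses a toy 1-D ridge-style regression instead of an SVM, and random hyperparameter proposals stand in for the Gaussian-process acquisition step of real Bayesian optimization. The hyperparameter name `ridge` and the helper functions are hypothetical. The key points it shows are that each proposed setting is scored by averaging the 5 held-out-fold losses, and that the returned model is a single refit on all the data (which is why one would get a plain model rather than a partitioned one):

```python
import random

random.seed(0)

# Toy data: y = 2*x + noise
data = [(x, 2.0 * x + random.gauss(0, 0.1)) for x in range(50)]

def train(train_set, ridge):
    # Hypothetical 1-D ridge-style fit: w = sum(x*y) / (sum(x^2) + ridge)
    sx2 = sum(x * x for x, _ in train_set)
    sxy = sum(x * y for x, y in train_set)
    return sxy / (sx2 + ridge)

def mse(w, test_set):
    # Mean squared error of the fitted slope on held-out points
    return sum((w * x - y) ** 2 for x, y in test_set) / len(test_set)

def cv_loss(ridge, k=5):
    # 5-fold cross-validation: the SAME hyperparameter setting is used
    # k times, each time training on 4/5 and evaluating on the held-out 1/5
    folds = [data[i::k] for i in range(k)]
    losses = []
    for i in range(k):
        held_out = folds[i]
        train_set = [p for j, f in enumerate(folds) if j != i for p in f]
        w = train(train_set, ridge)
        losses.append(mse(w, held_out))
    return sum(losses) / k  # the 5 values averaged into one objective

# Outer loop: 30 hyperparameter settings, each scored by its CV loss
# (random proposals here stand in for the Bayesian acquisition function)
best_ridge, best_loss = None, float("inf")
for _ in range(30):
    ridge = 10 ** random.uniform(-3, 3)  # propose a setting
    loss = cv_loss(ridge)
    if loss < best_loss:
        best_ridge, best_loss = ridge, loss

# Final step: refit ONE model on ALL the data with the best setting --
# analogous to receiving a RegressionSVM, not a RegressionPartitionedSVM
final_w = train(data, best_ridge)
print(round(final_w, 2))
```

The 5 intermediate fold models are only used for scoring and are discarded; only the averaged loss per setting feeds back into the optimizer.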