Why does Support Vector Regression (fitrsvm) optimization result in poor performance?
Hi
I am working on a numerical prediction problem (load forecasting). I have a predictor matrix of around 20 numerical variables (X_train: historical lags, calendar information, temperature, etc.) and an output vector of real/target load values (Load_train). The data consists of around 10,000 points.
I am following the documentation below on Support Vector Regression, in particular the section 'Optimize SVM Regression': https://matlabacademy.mathworks.com/R2016b/portal.html?course=mlml#chapter=4&lesson=7&section=3
However, after the exhaustive 'hyperparameter optimization' I get poor prediction performance, especially compared to a simpler SVR that uses a Gaussian kernel, 'KernelScale','auto', and all other parameters left at their defaults without any optimization. I do standardize the input and output matrices before training. Please find below the lines for the two training procedures:
"Simple SVR"
mdl_simple = fitrsvm(X_train,Load_train,'Standardize',true, ...
    'KernelFunction','gaussian','KernelScale','auto');
"Optimized SVR"
Mdl_optimized = fitrsvm(X,Y,'Standardize',true,'OptimizeHyperparameters','auto', ...
    'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName', ...
    'expected-improvement-plus'))
Would anyone have any advice regarding the optimization procedure, or any ideas why the optimized SVR might give worse results than the simpler SVR?
Best regards
Baran
2 comments
Walter Roberson
2017-1-6
Generally speaking, when more sophisticated fitting procedures end up giving worse results in practice, the problem can be due to overfitting.
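One way to check for this (a hedged sketch by the editor, reusing the poster's X_train/Load_train variables and an assumed 80/20 split) is to compare the error on the training data with the error on a held-out set:
```matlab
% Sketch: compare training loss vs. held-out loss to detect overfitting.
cv = cvpartition(size(X_train,1), 'HoldOut', 0.2);      % 80/20 split (assumed)
mdl = fitrsvm(X_train(training(cv),:), Load_train(training(cv)), ...
    'Standardize', true, 'KernelFunction', 'gaussian', 'KernelScale', 'auto');
trainRMSE = sqrt(resubLoss(mdl));                        % error on data the model saw
testRMSE  = sqrt(loss(mdl, X_train(test(cv),:), Load_train(test(cv))));
% A large gap between trainRMSE and testRMSE suggests overfitting.
```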
antlhem
2021-5-29
Could you take a look at my question? https://uk.mathworks.com/matlabcentral/answers/842800-why-matlab-svr-is-not-working-for-exponential-data-and-works-well-with-data-that-fluctuates?s_tid=prof_contriblnk
Accepted Answer
Don Mathis
2017-1-8
Edited: Walter Roberson
2017-1-8
Maybe it's because the optimized SVR uses the default kernel function, which is 'linear'. If you include 'KernelFunction','gaussian' in your second call to fitrsvm, as you did in your first, it might perform better.
Secondarily, you could try running the optimization longer by adding the field 'MaxObjectiveEvaluations' to your struct:
Mdl_optimized = fitrsvm(X,Y,'KernelFunction','gaussian','Standardize',true, ...
    'OptimizeHyperparameters','auto', ...
    'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName', ...
    'expected-improvement-plus','MaxObjectiveEvaluations',60))
2 comments
Don Mathis
2017-3-23
I just noticed your reply. If you have some time on your hands, you could try
Mdl_optimized = fitrsvm(X,Y,'OptimizeHyperparameters','all', ...
    'HyperparameterOptimizationOptions',struct('MaxObjectiveEvaluations',Inf, ...
    'SaveIntermediateResults',true))
This optimizes over six hyperparameters until you press Ctrl+C, at which point you will find an object called BayesoptResults in your workspace. The best hyperparameters are obtained using
bestPoint(BayesoptResults)
which you will then need to pass to a new call to fitrsvm manually.
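A sketch of that final step (editor's addition, untested: `bestPoint` returns a one-row table; the variable names below assume the 'auto' hyperparameter set, while 'all' also includes KernelFunction, PolynomialOrder, and Standardize):
```matlab
% Sketch: retrain on the best point found by the Bayesian optimization.
best = bestPoint(BayesoptResults);   % 1-row table of hyperparameter values
Mdl_final = fitrsvm(X, Y, ...
    'KernelFunction', 'gaussian', ...          % assumed kernel choice
    'BoxConstraint',  best.BoxConstraint, ...
    'KernelScale',    best.KernelScale, ...
    'Epsilon',        best.Epsilon, ...
    'Standardize',    true);
```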
More Answers (0)