kfoldLoss and regression machine learning like fitrsvm
Hello,
I want to calculate the cross-validation loss of my different regression machine learning models so I can compare them with each other. For this I want to use kfoldLoss, but I'm getting an error.
My code looks as follows:
% Split the data into 80% training and 20% test sets
[m,n] = size(Daten);
P = 0.8;
Training = Daten(1:round(P*m),:);
Testing  = Daten(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);   % predictors
YTrain = Training(:,n);       % response
XTest  = Testing(:,1:n-1);
YTest  = Testing(:,n);
% Hyperparameter optimization (Bayesian optimization)
rng default
c = cvpartition(YTrain,'KFold',10);
Mdl = fitrsvm(XTrain,YTrain,'KernelFunction','gaussian', ...
    'OptimizeHyperparameters','Epsilon', ...
    'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName', ...
    'expected-improvement-plus','cvpartition',c));
L = kfoldLoss(Mdl)   % this line throws the error
I want to use this code structure for other functions such as fitrtree and the other regression functions in the Bayesian optimization workflow. Why does kfoldLoss not work for this code?
Best regards, Dimitri
Answers (1)
Don Mathis
on 7 Nov 2018
When you call fitrsvm with 'OptimizeHyperparameters', the result is a single SVM model, not a cross-validated (partitioned) model with a kfoldLoss method. To get an estimate of the out-of-sample loss of your final model, run the crossval function on it and then call kfoldLoss on the result:
pm = crossval(Mdl,'CVPartition',c);   % partitioned model using the same folds
kfoldLoss(pm)
Or, since there's no need to reuse the cvpartition that you used for the optimization,
pm = crossval(Mdl,'KFold',10);        % fresh 10-fold partition
kfoldLoss(pm)
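To compare several model types, you can repeat the same pattern for each fitting function. Below is a minimal, untested sketch applying it to fitrtree, assuming the XTrain, YTrain, c, and Mdl variables from the question are in the workspace; 'MinLeafSize' is just one example of a hyperparameter to optimize, and the names treeMdl, pmTree, treeLoss, and svmLoss are illustrative.
% Sketch: optimize a regression tree the same way (setup assumed from above)
rng default
treeMdl = fitrtree(XTrain,YTrain,'OptimizeHyperparameters','MinLeafSize', ...
    'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName', ...
    'expected-improvement-plus','CVPartition',c));
% Cross-validate the returned single model, then estimate its loss
pmTree   = crossval(treeMdl,'KFold',10);
treeLoss = kfoldLoss(pmTree);
% Do the same for the optimized SVM and compare the two estimates
svmLoss  = kfoldLoss(crossval(Mdl,'KFold',10));
The model with the lower kfoldLoss value has the better estimated out-of-sample mean squared error, which is the default loss for regression models.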