How to compute the kfoldLoss error from a ClassificationLinear model?

% 5-fold cross-validation partition for the hyperparameter search
cv = cvpartition(numel(y_trainUndersampled), 'KFold', 5);
% Bayesian optimization settings, using the partition above
hyperOpt = struct('AcquisitionFunctionName', 'expected-improvement-plus', ...
    'Optimizer', 'bayesopt', 'MaxObjectiveEvaluations', 100, ...
    'CVPartition', cv);
% Optimize Lambda and Regularization for a logistic linear classifier
bestLogsMdl = fitclinear(X_trainUndersampled, y_trainUndersampled, ...
    'Learner', 'logistic', ...
    'OptimizeHyperparameters', {'Lambda','Regularization'}, ...
    'HyperparameterOptimizationOptions', hyperOpt, ...
    'ScoreTransform', 'logit');
Hi, I have used hyperparameter optimization with the fitclinear function. The code above returns bestLogsMdl as a ClassificationLinear model.
I want to compute the k-fold loss (kfoldLoss) for this model.
However, according to the documentation at https://uk.mathworks.com/help/stats/fitclinear.html#bu5mw4p , kfoldLoss is used on a ClassificationPartitionedLinear model.
How can I use hyperparameter optimization with fitclinear together with kfoldLoss? What modifications are needed to the fitclinear call so that it produces a ClassificationPartitionedLinear model?
My ultimate goal is to plot a misclassification rate vs. number of learning cycles graph.

Accepted Answer

Walter Roberson
Walter Roberson on 8 Dec 2020
Edited: Walter Roberson on 10 Dec 2020
You cannot use any cross-validation name-value pair argument along with the 'OptimizeHyperparameters' name-value pair argument. You can modify the cross-validation for 'OptimizeHyperparameters' only by using the 'HyperparameterOptimizationOptions' name-value pair argument.
So you need to get rid of 'OptimizeHyperparameters' and set appropriate 'HyperparameterOptimizationOptions'.
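In other words, a single fitclinear call cannot both optimize hyperparameters and return a cross-validated model. One way around this, sketched below (not from the original answer), is to run the optimization first and then re-fit a cross-validated model with the selected hyperparameters, so that kfoldLoss applies. The Lambda property access and the 'ridge' choice are assumptions; substitute whatever values the optimization actually selected.

% Minimal sketch: re-fit with the optimized hyperparameters and a CV partition
bestLambda = bestLogsMdl.Lambda;              % assumption: regularization strength chosen by the optimization
cvMdl = fitclinear(X_trainUndersampled, y_trainUndersampled, ...
    'Learner', 'logistic', ...
    'Lambda', bestLambda, ...
    'Regularization', 'ridge', ...            % assumption: replace with the optimized choice ('ridge' or 'lasso')
    'ScoreTransform', 'logit', ...
    'CVPartition', cv);                       % a cross-validation argument makes fitclinear return a ClassificationPartitionedLinear
misclassRate = kfoldLoss(cvMdl)               % k-fold misclassification rate

If all you need is the cross-validated loss found during the search itself, it may also be available from bestLogsMdl.HyperparameterOptimizationResults (a BayesianOptimization object).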

Release

R2020b
