Objective function in the Bayesian optimization algorithm used by fitrsvm and fitrgp
Hello,
What is the mathematical objective function used by the Bayesian optimization algorithm? The documentation says that functions like fitrsvm try to minimize log(1 + cross-validation loss), but what is the actual mathematical formula?
Is it possible to change the objective function to just the MSE?
Thank you!
Dimitri
Accepted Answer
Don Mathis
on 13 May 2019
This page says that the loss defaults to MSE, so that is the loss used in the log(1 + cvloss) formula. In other words, the objective being minimized is log(1 + L_CV(θ)), where L_CV(θ) is the cross-validated MSE for a given set of hyperparameters θ, computed over all the held-out validation sets. The default when using hyperparameter optimization is 5-fold cross-validation.
There's not an option to change the hyperparameter optimization objective function from log(1+cvloss). You would need to edit the source code to do that. The source file is matlab\toolbox\stats\classreg\+classreg\+learning\+paramoptim\createObjFcn.m. Look for the call to the log1p function.
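If you just want to see the relationship numerically, here is a minimal sketch (the toy data, variable names, and options are purely illustrative) that recovers the cross-validated MSE from the reported MinObjective:
rng(1);                                   % illustrative toy regression data
X = randn(200,3);
Y = X*[1; -2; 0.5] + 0.1*randn(200,1);
c = cvpartition(size(X,1), 'KFold', 5);   % fix the 5-fold partition
Mdl = fitrsvm(X, Y, 'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct('CVPartition', c, 'ShowPlots', false));
results = Mdl.HyperparameterOptimizationResults;
results.MinObjective                      % log(1 + best cross-validated MSE) found by bayesopt
expm1(results.MinObjective)               % invert the log1p transform to get the cv MSE itself
The expm1 call at the end simply undoes the log1p transform mentioned above.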
3 Comments
Don Mathis
on 14 May 2019 (edited)
Because loss(Mdl,X,Y) is the loss of the final model on the full dataset, while MinObjective is the log of 1 plus the out-of-sample cross-validated loss. See the kfoldLoss method for documentation of that. If you used 5-fold cross-validation, kfoldLoss combines the losses of 5 different models, each evaluated on the 1/5 of the data it was not trained on. It is not the loss of the final model on the full dataset.
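As a rough illustration of that difference (the toy data and names here are just for the example):
rng(1);
X = randn(200,3);
Y = X*[1; -2; 0.5] + 0.1*randn(200,1);
Mdl = fitrsvm(X, Y);               % final model trained on all of the data
loss(Mdl, X, Y)                    % MSE of that single model on the full dataset
CVMdl = crossval(Mdl, 'KFold', 5); % 5 models, each trained on 4/5 of the data
kfoldLoss(CVMdl)                   % out-of-sample MSE from the held-out fifths
log1p(kfoldLoss(CVMdl))            % the form the optimizer reports as its objective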
antlhem
on 29 May 2021
Hi, could you take a look at my question? https://uk.mathworks.com/matlabcentral/answers/842800-why-matlab-svr-is-not-working-for-exponential-data-and-works-well-with-data-that-fluctuates?s_tid=prof_contriblnk