
BayesianOptimization

Bayesian optimization results

Description

A BayesianOptimization object contains the results of a Bayesian optimization. It is the output of bayesopt or of a fit function that accepts the OptimizeHyperparameters name-value argument, such as fitcdiscr. In addition, a BayesianOptimization object contains data for each iteration of bayesopt that can be accessed by a plot function or an output function.

Creation

Create a BayesianOptimization object by using the bayesopt function, or by calling a fit function with the OptimizeHyperparameters name-value argument.

Properties


Problem Definition Properties

ObjectiveFcn

This property is read-only.

ObjectiveFcn argument used by bayesopt, specified as a function handle.

  • If you call bayesopt directly, ObjectiveFcn is the bayesopt objective function argument.

  • If you call a fit function with the OptimizeHyperparameters name-value argument, ObjectiveFcn is a function handle that returns the misclassification rate for classification, or the logarithm of one plus the cross-validation loss for regression, measured by five-fold cross-validation.

Data Types: function_handle
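For a direct call to bayesopt, the objective function accepts a 1-by-D table of variable values and returns a scalar to minimize. A minimal sketch (the variable names x1 and x2 and the quadratic objective are illustrative assumptions, not part of this reference):

```matlab
% Sketch: a bayesopt objective function receives a one-row table of
% variable values and returns a scalar objective to minimize.
x1 = optimizableVariable('x1',[-5,5]);
x2 = optimizableVariable('x2',[-5,5]);
fun = @(x)(x.x1 - 1).^2 + (x.x2 + 2).^2;   % illustrative objective
results = bayesopt(fun,[x1,x2],'Verbose',0);
results.ObjectiveFcn   % the stored function handle
```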

VariableDescriptions

This property is read-only.

VariableDescriptions argument that bayesopt used, specified as a vector of optimizableVariable objects.

  • If you called bayesopt directly, VariableDescriptions is the bayesopt variable description argument.

  • If you called a fit function with the OptimizeHyperparameters name-value pair, VariableDescriptions is the vector of hyperparameters.

Options

This property is read-only.

Options that bayesopt used, specified as a structure.

  • If you called bayesopt directly, Options contains the options used by bayesopt, which are the name-value arguments described in bayesopt Input Arguments.

  • If you called a fit function with the OptimizeHyperparameters name-value argument, Options contains the default bayesopt options, modified by the HyperparameterOptimizationOptions name-value argument.

Options is a read-only structure containing the following fields.

AcquisitionFunctionName: Acquisition function name. See Acquisition Function Types.
IsObjectiveDeterministic: true means the objective function is deterministic, false otherwise.
ExplorationRatio: Used only when AcquisitionFunctionName is 'expected-improvement-plus' or 'expected-improvement-per-second-plus'. See Plus.

MaxObjectiveEvaluations: Objective function evaluation limit.
MaxTime: Time limit.

XConstraintFcn: Deterministic constraints on variables. See Deterministic Constraints — XConstraintFcn.
ConditionalVariableFcn: Conditional variable constraints. See Conditional Constraints — ConditionalVariableFcn.
NumCoupledConstraints: Number of coupled constraints. See Coupled Constraints.
CoupledConstraintTolerances: Coupled constraint tolerances. See Coupled Constraints.
AreCoupledConstraintsDeterministic: Logical vector specifying whether each coupled constraint is deterministic.

Verbose: Command-line display level.
OutputFcn: Function called after each iteration. See Bayesian Optimization Output Functions.
SaveVariableName: Variable name for the @assignInBase output function.
SaveFileName: File name for the @saveToFile output function.
PlotFcn: Plot function called after each iteration. See Bayesian Optimization Plot Functions.

InitialX: Points where bayesopt evaluated the objective function.
InitialObjective: Objective function values at InitialX.
InitialConstraintViolations: Coupled constraint function values at InitialX.
InitialErrorValues: Error values at InitialX.
InitialObjectiveEvaluationTimes: Objective function evaluation times at InitialX.
InitialIterationTimes: Time for each iteration, including objective function evaluation and other computations.

Data Types: struct
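When a fit function builds these options for you, they can be adjusted through the HyperparameterOptimizationOptions name-value argument, and the resulting settings are recorded in this property. A sketch (the particular field choices here are illustrative):

```matlab
% Sketch: customizing the options that a fit function passes to bayesopt.
% The read-only Options structure of the results records these settings.
load ionosphere
Mdl = fitcsvm(X,Y,'OptimizeHyperparameters','auto', ...
    'HyperparameterOptimizationOptions', ...
    struct('MaxObjectiveEvaluations',15,'Verbose',0));
Mdl.HyperparameterOptimizationResults.Options   % inspect the stored options
```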

Solution Properties

MinObjective

This property is read-only.

Minimum observed value of objective function, specified as a real scalar. When there are coupled constraints or evaluation errors, this value is the minimum over all observed points that are feasible according to the final constraint and Error models.

Data Types: double

XAtMinObjective

This property is read-only.

Observed point with minimum objective function value, specified as a 1-by-D table, where D is the number of variables.

Data Types: table

MinEstimatedObjective

This property is read-only.

Estimated objective function value at XAtMinEstimatedObjective, specified as a real scalar.

MinEstimatedObjective is the mean value of the posterior distribution of the final objective model. The software estimates the MinEstimatedObjective value by passing XAtMinEstimatedObjective to the object function predictObjective.

Data Types: double

XAtMinEstimatedObjective

This property is read-only.

Point with the minimum upper confidence bound of the objective function value among the visited points, specified as a 1-by-D table, where D is the number of variables. The software uses the final objective model to find the upper confidence bounds of the visited points.

XAtMinEstimatedObjective is the same as the best point returned by the bestPoint function with the default criterion ('min-visited-upper-confidence-interval').

Data Types: table
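To recover this point after the optimization finishes, or to select a point by a different criterion, call bestPoint on the results object. A sketch, assuming results is a completed BayesianOptimization object returned by bayesopt:

```matlab
% Sketch: with the default criterion, bestPoint returns the same point
% as the XAtMinEstimatedObjective property.
[x,critVal] = bestPoint(results);   % default 'min-visited-upper-confidence-interval'
isequal(x,results.XAtMinEstimatedObjective)   % should be true

% A different criterion can return a different point:
xObs = bestPoint(results,'Criterion','min-observed');
```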

NumObjectiveEvaluations

This property is read-only.

Number of objective function evaluations, specified as a positive integer. This count includes the initial evaluations used to form a posterior model as well as the evaluations during the optimization iterations.

Data Types: double

TotalElapsedTime

This property is read-only.

Total elapsed time of optimization in seconds, specified as a positive scalar.

Data Types: double

NextPoint

This property is read-only.

Next point to evaluate if optimization continues, specified as a 1-by-D table, where D is the number of variables.

Data Types: table

Trace Properties

XTrace

This property is read-only.

Points where the objective function was evaluated, specified as a T-by-D table, where T is the number of evaluation points and D is the number of variables.

Data Types: table

ObjectiveTrace

This property is read-only.

Objective function values, specified as a column vector of length T, where T is the number of evaluation points. ObjectiveTrace contains the history of objective function evaluations.

Data Types: double

ObjectiveEvaluationTimeTrace

This property is read-only.

Objective function evaluation times, specified as a column vector of length T, where T is the number of evaluation points. ObjectiveEvaluationTimeTrace includes the time in evaluating coupled constraints, because the objective function computes these constraints.

Data Types: double

IterationTimeTrace

This property is read-only.

Iteration times, specified as a column vector of length T, where T is the number of evaluation points. IterationTimeTrace includes both objective function evaluation time and other overhead.

Data Types: double

ConstraintsTrace

This property is read-only.

Coupled constraint values, specified as a T-by-K array, where T is the number of evaluation points and K is the number of coupled constraints.

Data Types: double

ErrorTrace

This property is read-only.

Error indications, specified as a column vector of length T with entries of -1 or 1, where T is the number of evaluation points. Each 1 entry indicates that the objective function encountered an error or returned NaN at the corresponding point in XTrace. Each -1 entry indicates that the objective function value was computed successfully.

Data Types: double

FeasibilityTrace

This property is read-only.

Feasibility indications, specified as a logical column vector of length T, where T is the number of evaluation points. Each 1 entry indicates that the final constraint model predicts feasibility at the corresponding point in XTrace.

Data Types: logical

FeasibilityProbabilityTrace

This property is read-only.

Probability that evaluation point is feasible, specified as a column vector of length T, where T is the number of evaluation points. The probabilities come from the final constraint model, including the error constraint model, on the corresponding points in XTrace.

Data Types: double

IndexOfMinimumTrace

This property is read-only.

Which evaluation gave minimum feasible objective, specified as a column vector of integer indices of length T, where T is the number of evaluation points. Feasibility is determined with respect to the constraint models that existed at each iteration, including the error constraint model.

Data Types: double

ObjectiveMinimumTrace

This property is read-only.

Minimum observed objective, specified as a column vector of length T, where T is the number of evaluation points.

Data Types: double

EstimatedObjectiveMinimumTrace

This property is read-only.

Estimated objective, specified as a column vector of length T, where T is the number of evaluation points. The estimated objective at each iteration is determined with respect to the objective model at that iteration. At each iteration, the software uses the object function predictObjective to estimate the objective function value at the point with the minimum upper confidence bound of the objective function among the visited points.

Data Types: double

UserDataTrace

This property is read-only.

Auxiliary data from the objective function, specified as a cell array of length T, where T is the number of evaluation points. Each entry in the cell array is the UserData returned in the third output of the objective function.

Data Types: cell
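An objective function populates this trace through its optional third output. A sketch of such a function, saved in a file named objFun.m (the function name, objective, and choice of logged data are illustrative assumptions):

```matlab
% Sketch: a bayesopt objective function that returns auxiliary data in its
% third output. After optimization, results.UserDataTrace{t} holds the
% userData value from evaluation t.
function [objective,constraints,userData] = objFun(x)
    objective = (x.x1 - 1).^2;     % illustrative objective
    constraints = [];              % no coupled constraints
    userData = table2struct(x);    % anything you want logged per evaluation
end
```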

Object Functions

bestPoint: Best point in a Bayesian optimization according to a criterion
plot: Plot Bayesian optimization results
predictConstraints: Predict coupled constraint violations at a set of points
predictError: Predict error value at a set of points
predictObjective: Predict objective function at a set of points
predictObjectiveEvaluationTime: Predict objective function run times at a set of points
resume: Resume a Bayesian optimization
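These functions query the models stored in the object or continue the optimization. A sketch, assuming results is a BayesianOptimization object returned by bayesopt:

```matlab
% Sketch: querying the final models and continuing the optimization.
XNew = results.NextPoint;                        % 1-by-D table of variable values
objEst = predictObjective(results,XNew);         % posterior mean of the objective
tEst = predictObjectiveEvaluationTime(results,XNew);  % predicted evaluation time
results2 = resume(results,'MaxObjectiveEvaluations',10);  % run 10 more evaluations
```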

Examples


This example shows how to create a BayesianOptimization object by using bayesopt to minimize cross-validation loss.

Optimize hyperparameters of a KNN classifier for the ionosphere data, that is, find KNN hyperparameters that minimize the cross-validation loss. Have bayesopt minimize over the following hyperparameters:

  • Nearest-neighborhood sizes from 1 to 30

  • Distance functions 'chebychev', 'euclidean', and 'minkowski'.

For reproducibility, set the random seed, set the partition, and set the AcquisitionFunctionName option to 'expected-improvement-plus'. To suppress iterative display, set 'Verbose' to 0. Pass the partition c and fitting data X and Y to the objective function fun by creating fun as an anonymous function that incorporates this data. See Parameterizing Functions.

load ionosphere
rng default
num = optimizableVariable('n',[1,30],'Type','integer');
dst = optimizableVariable('dst',{'chebychev','euclidean','minkowski'},'Type','categorical');
c = cvpartition(351,'Kfold',5);
fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n,...
    'Distance',char(x.dst),'NSMethod','exhaustive'));
results = bayesopt(fun,[num,dst],'Verbose',0,...
    'AcquisitionFunctionName','expected-improvement-plus')

Figure: objective function model, showing the observed points, model mean, next point, and model minimum feasible.

Figure: minimum objective versus number of function evaluations, showing the minimum observed objective and the estimated minimum objective.

results = 
  BayesianOptimization with properties:

                      ObjectiveFcn: [function_handle]
              VariableDescriptions: [1x2 optimizableVariable]
                           Options: [1x1 struct]
                      MinObjective: 0.1197
                   XAtMinObjective: [1x2 table]
             MinEstimatedObjective: 0.1213
          XAtMinEstimatedObjective: [1x2 table]
           NumObjectiveEvaluations: 30
                  TotalElapsedTime: 55.3628
                         NextPoint: [1x2 table]
                            XTrace: [30x2 table]
                    ObjectiveTrace: [30x1 double]
                  ConstraintsTrace: []
                     UserDataTrace: {30x1 cell}
      ObjectiveEvaluationTimeTrace: [30x1 double]
                IterationTimeTrace: [30x1 double]
                        ErrorTrace: [30x1 double]
                  FeasibilityTrace: [30x1 logical]
       FeasibilityProbabilityTrace: [30x1 double]
               IndexOfMinimumTrace: [30x1 double]
             ObjectiveMinimumTrace: [30x1 double]
    EstimatedObjectiveMinimumTrace: [30x1 double]

This example shows how to minimize the cross-validation loss in the ionosphere data using Bayesian optimization of an SVM classifier.

Load the data.

load ionosphere

Optimize the classification using the 'auto' parameters.

rng default % For reproducibility
Mdl = fitcsvm(X,Y,'OptimizeHyperparameters','auto')
|=====================================================================================================|
| Iter | Eval   | Objective   | Objective   | BestSoFar   | BestSoFar   | BoxConstraint|  KernelScale |
|      | result |             | runtime     | (observed)  | (estim.)    |              |              |
|=====================================================================================================|
|    1 | Best   |     0.20513 |      18.569 |     0.20513 |     0.20513 |       64.836 |    0.0015729 |
|    2 | Accept |     0.35897 |      0.2057 |     0.20513 |     0.21471 |     0.036335 |       5.5755 |
|    3 | Best   |     0.13105 |      6.2333 |     0.13105 |     0.14133 |    0.0022147 |    0.0023957 |
|    4 | Accept |     0.35897 |     0.29754 |     0.13105 |     0.13109 |       5.1259 |        98.62 |
|    5 | Best   |     0.12536 |     0.58088 |     0.12536 |     0.12538 |    0.0010035 |     0.022328 |
|    6 | Accept |     0.13105 |     0.40129 |     0.12536 |     0.12624 |    0.0010683 |     0.010111 |
|    7 | Accept |     0.13105 |     0.35759 |     0.12536 |     0.12546 |    0.0010192 |     0.010237 |
|    8 | Accept |     0.13105 |      6.2978 |     0.12536 |     0.12544 |     0.083367 |     0.014339 |
|    9 | Accept |      0.1339 |      14.018 |     0.12536 |     0.12546 |    0.0010021 |    0.0010244 |
|   10 | Best   |     0.12251 |     0.25271 |     0.12251 |      0.1226 |    0.0010657 |     0.043944 |
|   11 | Accept |     0.12536 |     0.52661 |     0.12251 |     0.12273 |    0.0045194 |     0.043634 |
|   12 | Accept |     0.12821 |     0.64736 |     0.12251 |     0.12506 |    0.0010123 |      0.03649 |
|   13 | Accept |     0.12536 |     0.63727 |     0.12251 |     0.12511 |    0.0040337 |     0.031924 |
|   14 | Accept |      0.1339 |     0.34713 |     0.12251 |     0.12472 |    0.0010317 |     0.080017 |
|   15 | Accept |     0.12251 |     0.22915 |     0.12251 |     0.12414 |     0.003321 |     0.033169 |
|   16 | Accept |     0.12536 |     0.34806 |     0.12251 |     0.12439 |    0.0047749 |     0.034094 |
|   17 | Accept |     0.35897 |     0.44574 |     0.12251 |     0.12436 |       0.3106 |       994.65 |
|   18 | Accept |     0.35897 |     0.25864 |     0.12251 |      0.1244 |    0.0017553 |        54.93 |
|   19 | Accept |     0.35897 |     0.37266 |     0.12251 |      0.1247 |    0.0010187 |       1.0613 |
|   20 | Accept |     0.12251 |     0.35001 |     0.12251 |     0.12379 |    0.0084324 |     0.047095 |
|=====================================================================================================|
| Iter | Eval   | Objective   | Objective   | BestSoFar   | BestSoFar   | BoxConstraint|  KernelScale |
|      | result |             | runtime     | (observed)  | (estim.)    |              |              |
|=====================================================================================================|
|   21 | Accept |     0.13105 |     0.54292 |     0.12251 |      0.1245 |     0.023108 |     0.049961 |
|   22 | Accept |     0.12251 |      0.2933 |     0.12251 |      0.1237 |    0.0010439 |     0.042774 |
|   23 | Accept |     0.12251 |     0.35466 |     0.12251 |     0.12342 |    0.0010425 |     0.043785 |
|   24 | Accept |      0.2906 |     0.28345 |     0.12251 |     0.12346 |       1.9994 |       20.839 |
|   25 | Accept |     0.12821 |     0.25808 |     0.12251 |     0.12435 |    0.0010389 |     0.046026 |
|   26 | Accept |     0.12251 |     0.32151 |     0.12251 |     0.12401 |     0.057129 |      0.40374 |
|   27 | Accept |     0.13105 |     0.57236 |     0.12251 |     0.12339 |       1.7599 |      0.44056 |
|   28 | Accept |     0.12536 |     0.43678 |     0.12251 |     0.12338 |      0.12534 |       0.2241 |
|   29 | Accept |     0.12821 |     0.26761 |     0.12251 |     0.12437 |       0.2482 |       0.4511 |
|   30 | Accept |     0.12821 |     0.23902 |     0.12251 |     0.12467 |      0.11675 |      0.31759 |

Figure: minimum objective versus number of function evaluations, showing the minimum observed objective and the estimated minimum objective.

Figure: objective function model, showing the observed points, model mean, next point, and model minimum feasible.

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 30 reached.
Total function evaluations: 30
Total elapsed time: 101.7799 seconds
Total objective function evaluation time: 54.9466

Best observed feasible point:
    BoxConstraint    KernelScale
    _____________    ___________

      0.0010657       0.043944  

Observed objective function value = 0.12251
Estimated objective function value = 0.12467
Function evaluation time = 0.25271

Best estimated feasible point (according to models):
    BoxConstraint    KernelScale
    _____________    ___________

      0.0010657       0.043944  

Estimated objective function value = 0.12467
Estimated function evaluation time = 0.32358
Mdl = 
  ClassificationSVM
                         ResponseName: 'Y'
                CategoricalPredictors: []
                           ClassNames: {'b'  'g'}
                       ScoreTransform: 'none'
                      NumObservations: 351
    HyperparameterOptimizationResults: [1x1 BayesianOptimization]
                                Alpha: [112x1 double]
                                 Bias: -3.2345
                     KernelParameters: [1x1 struct]
                       BoxConstraints: [351x1 double]
                      ConvergenceInfo: [1x1 struct]
                      IsSupportVector: [351x1 logical]
                               Solver: 'SMO'



The fit achieved about 12% loss with the default 5-fold cross-validation.

Examine the BayesianOptimization object that is returned in the HyperparameterOptimizationResults property of the returned model.

disp(Mdl.HyperparameterOptimizationResults)
  BayesianOptimization with properties:

                      ObjectiveFcn: @createObjFcn/inMemoryObjFcn
              VariableDescriptions: [5x1 optimizableVariable]
                           Options: [1x1 struct]
                      MinObjective: 0.1225
                   XAtMinObjective: [1x2 table]
             MinEstimatedObjective: 0.1247
          XAtMinEstimatedObjective: [1x2 table]
           NumObjectiveEvaluations: 30
                  TotalElapsedTime: 101.7799
                         NextPoint: [1x2 table]
                            XTrace: [30x2 table]
                    ObjectiveTrace: [30x1 double]
                  ConstraintsTrace: []
                     UserDataTrace: {30x1 cell}
      ObjectiveEvaluationTimeTrace: [30x1 double]
                IterationTimeTrace: [30x1 double]
                        ErrorTrace: [30x1 double]
                  FeasibilityTrace: [30x1 logical]
       FeasibilityProbabilityTrace: [30x1 double]
               IndexOfMinimumTrace: [30x1 double]
             ObjectiveMinimumTrace: [30x1 double]
    EstimatedObjectiveMinimumTrace: [30x1 double]
Introduced in R2016b