
crossval

Class: ClassificationTree

Cross-validated decision tree

Syntax

cvmodel = crossval(model)
cvmodel = crossval(model,Name,Value)

Description

cvmodel = crossval(model) creates a partitioned model from model, a fitted classification tree. By default, crossval uses 10-fold cross-validation on the training data to create cvmodel.

cvmodel = crossval(model,Name,Value) creates a partitioned model with additional options specified by one or more Name,Value pair arguments.
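
For example, a minimal sketch of both syntaxes, assuming predictor data X and class labels Y are already in the workspace:

tree = fitctree(X,Y);            % fitted classification tree
cvdefault = crossval(tree);      % 10-fold cross-validation (default)
cv5 = crossval(tree,'KFold',5);  % 5-fold cross-validation instead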

Input Arguments


model

A classification tree model, specified as a ClassificationTree object produced by fitctree.

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

CVPartition

Cross-validation partition, specified as the comma-separated pair consisting of 'CVPartition' and a cvpartition object created by the cvpartition function. crossval splits the data into subsets with cvpartition.

Use only one of these four options at a time: 'CVPartition', 'Holdout', 'KFold', or 'Leaveout'.
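
For example, a short sketch that supplies an explicit partition, assuming the class labels Y used to train model are in the workspace (cvp is just an illustrative variable name):

cvp = cvpartition(Y,'KFold',5);               % 5-fold partition, stratified by class
cvmodel = crossval(model,'CVPartition',cvp);  % cross-validate using that partition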

Holdout

Fraction of the data used for holdout validation, specified as the comma-separated pair consisting of 'Holdout' and a scalar value in the range (0,1).

Use only one of these four options at a time: 'CVPartition', 'Holdout', 'KFold', or 'Leaveout'.

Example: 'Holdout',0.3

Data Types: single | double
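
For example, a sketch that reserves 30% of the observations for validation:

cvmodel = crossval(model,'Holdout',0.3);  % train on 70% of the data, hold out 30%
L = kfoldLoss(cvmodel);                   % loss on the held-out observations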

KFold

Number of folds to use in a cross-validated model, specified as the comma-separated pair consisting of 'KFold' and a positive integer value greater than 1.

Use only one of these four options at a time: 'CVPartition', 'Holdout', 'KFold', or 'Leaveout'.

Example: 'KFold',3

Data Types: single | double

Leaveout

Leave-one-out cross-validation flag, specified as the comma-separated pair consisting of 'Leaveout' and 'on' or 'off'. Leave-one-out is a special case of 'KFold' in which the number of folds equals the number of observations.

Use only one of these four options at a time: 'CVPartition', 'Holdout', 'KFold', or 'Leaveout'.

Example: 'Leaveout','on'

Output Arguments


cvmodel

Partitioned model, returned as a ClassificationPartitionedModel object.
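
For example, a brief sketch of inspecting the partitioned model, assuming tree is a fitted classification tree:

cvmodel = crossval(tree);
cvmodel.KFold       % number of folds (10 by default)
cvmodel.Trained{1}  % compact tree trained with the first fold's training observations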

Examples


Create a classification model for the ionosphere data, then create a cross-validation model. Evaluate the quality of the model using kfoldLoss.

load ionosphere
tree = fitctree(X,Y);
cvmodel = crossval(tree);
L = kfoldLoss(cvmodel)
L = 0.1083

Tips

  • Assess the predictive performance of model on cross-validated data using the “kfold” methods and properties of cvmodel, such as kfoldLoss.
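
For example, a minimal sketch using two of these kfold methods on cvmodel:

labels = kfoldPredict(cvmodel);  % class labels predicted for each observation by the model trained without it
L = kfoldLoss(cvmodel);          % corresponding out-of-fold classification loss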

Alternatives

You can create a cross-validated tree directly from the data, instead of first fitting a decision tree and then cross-validating it. To do so, include one of these five options in the call to fitctree: 'CrossVal', 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'.
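
For example, a sketch that cross-validates while fitting, assuming X and Y are in the workspace:

cvmodel = fitctree(X,Y,'CrossVal','on');  % returns a partitioned (10-fold) model directly
L = kfoldLoss(cvmodel);                   % cross-validated misclassification loss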


See Also

fitctree | cvpartition | kfoldLoss | ClassificationPartitionedModel