# cvloss

**Class:** `RegressionTree`

Regression error by cross validation

## Syntax

`E = cvloss(tree)`

`[E,SE] = cvloss(tree)`

`[E,SE,Nleaf] = cvloss(tree)`

`[E,SE,Nleaf,BestLevel] = cvloss(tree)`

`[E,...] = cvloss(tree,Name,Value)`

## Description

`E = cvloss(tree)` returns the cross-validated regression error (loss) for `tree`, a regression tree.

`[E,SE] = cvloss(tree)` also returns the standard error of `E`.

`[E,SE,Nleaf] = cvloss(tree)` also returns the number of leaves (terminal nodes) in `tree`.

`[E,SE,Nleaf,BestLevel] = cvloss(tree)` also returns the optimal pruning level for `tree`.

`[E,...] = cvloss(tree,Name,Value)` cross-validates with additional options specified by one or more `Name,Value` pair arguments. You can specify several name-value pair arguments in any order as `Name1,Value1,…,NameN,ValueN`.

## Input Arguments

## Output Arguments

## Examples
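A minimal sketch of computing the cross-validated loss for a regression tree. It assumes the `carsmall` sample data set that ships with Statistics and Machine Learning Toolbox; the variable names `Displacement` and `MPG` come from that data set.

```matlab
% Load sample data and grow a regression tree predicting MPG
load carsmall
tree = fitrtree(Displacement,MPG);

% Cross-validation partitions the data randomly; seed the generator
% so the estimate is reproducible
rng(1)

% Cross-validated loss, its standard error, the number of leaves,
% and the optimal pruning level
[E,SE,Nleaf,BestLevel] = cvloss(tree);
```

Because the folds are chosen at random, `E` varies from run to run unless you fix the random seed as above.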

## Alternatives

You can construct a cross-validated tree model with `crossval`, and call `kfoldLoss` instead of `cvloss`. If you are going to examine the cross-validated tree more than once, then the alternative can save time.

However, unlike `cvloss`, `kfoldLoss` does not return `SE`, `Nleaf`, or `BestLevel`.
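The alternative described above can be sketched as follows, again assuming the `carsmall` sample data set:

```matlab
load carsmall
tree = fitrtree(Displacement,MPG);

% Cross-validate once; the partitioned model can be reused
cvtree = crossval(tree);   % 10-fold cross-validated model by default

% Query the cross-validation loss as often as needed without refitting
L = kfoldLoss(cvtree);
```

Here the cost of partitioning and refitting is paid once, in `crossval`, so repeated calls to `kfoldLoss` on `cvtree` are cheap compared with repeated calls to `cvloss(tree)`.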