
Support vector machine template

`t = templateSVM()` returns a support vector machine (SVM) learner template suitable for training error-correcting output code (ECOC) multiclass models.

If you specify a default template, then the software uses default values for all input arguments during training.

Specify `t` as a binary learner, or one in a set of binary learners, in `fitcecoc` to train an ECOC multiclass classifier.
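As a sketch of this workflow, using the Fisher iris sample data set as stand-in training data:

```matlab
% Create a default SVM template and use it as the binary learner
% in fitcecoc. meas (predictors) and species (class labels) come
% from the Fisher iris sample data set.
load fisheriris
t = templateSVM();                          % all options left at defaults
Mdl = fitcecoc(meas,species,'Learners',t);  % one-vs-one ECOC by default
```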

`t = templateSVM(Name,Value)` returns a template with additional options specified by one or more name-value pair arguments.

For example, you can specify the box constraint, the kernel function, or whether to standardize the predictors.
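For instance, a template for a Gaussian-kernel SVM with standardized predictors might look like this sketch:

```matlab
% Template specifying the kernel function, box constraint, and
% predictor standardization; unspecified options stay at their defaults.
t = templateSVM('KernelFunction','gaussian', ...
    'BoxConstraint',1,'Standardize',true);
```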

If you display `t` in the Command Window, then all options appear empty (`[]`), except those that you specify using name-value pair arguments. During training, the software uses default values for empty options.
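To see this behavior, display a template that specifies a single option; for example:

```matlab
% Only Standardize is populated; every other option displays as [].
t = templateSVM('Standardize',true)
```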

By default and for efficiency, `fitcecoc` empties the `Alpha`, `SupportVectorLabels`, and `SupportVectors` properties for all linear SVM binary learners. `fitcecoc` lists `Beta`, rather than `Alpha`, in the model display.

To store `Alpha`, `SupportVectorLabels`, and `SupportVectors`, pass a linear SVM template that specifies storing support vectors to `fitcecoc`. For example, enter:

```matlab
t = templateSVM('SaveSupportVectors',true);
Mdl = fitcecoc(X,Y,'Learners',t);
```

You can remove the support vectors and related values by passing the resulting `ClassificationECOC` model to `discardSupportVectors`.
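An end-to-end sketch of storing and later discarding support vectors, again using the Fisher iris data as placeholder training data:

```matlab
% Train with stored support vectors, then discard them to reduce
% the memory footprint of the trained ECOC model.
load fisheriris
t = templateSVM('SaveSupportVectors',true);
Mdl = fitcecoc(meas,species,'Learners',t);
Mdl = discardSupportVectors(Mdl);   % empties Alpha, SupportVectors, and
                                    % SupportVectorLabels in each learner
```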
