INSTITUTE OF PHYSICS PUBLISHING
VOLUME 3 (2003) 1–10
Estimating GARCH models using
support vector machines*
Julio A Afonso-Rodríguez
Department of Signal Theory and Communications, University Carlos III,
Leganés, 28911 Madrid, Spain
Department of Institutional Economics, Economic Statistics and
Econometrics, University of La Laguna, 38071 Tenerife, Canary Islands, Spain
Department of Financial Economy and Accounting, University of La
Laguna, 38071 Tenerife, Canary Islands, Spain
E-mail: firstname.lastname@example.org, email@example.com and firstname.lastname@example.org
Received 17 February 2002, in final form 20 February 2003
Online at stacks.iop.org/Quant/3
Support vector machines (SVMs) are a new nonparametric tool for regression
estimation. We will use this tool to estimate the parameters of a GARCH
model for predicting the conditional volatility of stock market returns.
GARCH models are usually estimated using maximum likelihood (ML)
procedures, assuming that the data are normally distributed. In this paper, we
will show that GARCH models can be estimated using SVMs and that such
estimates have a higher predictive ability than those obtained via common
ML procedures.
Financial returns series are mainly characterized by having
a zero mean, exhibiting high kurtosis and little, if any,
autocorrelation. The squares of these returns often present
high correlation and persistence, which makes ARCH-type
models suitable for estimating the conditional volatility of
such processes; see Engle (1982) for the seminal work,
(1994) for a survey on volatility models and
Engle and Patton (2001) for several extensions. The ARCH
parameters are usually estimated using maximum likelihood
(ML) procedures that are optimal when the data are drawn from
a Gaussian distribution.
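The standard Gaussian ML approach for the common GARCH(1,1) special case can be sketched as follows. This is a minimal illustration only, not the estimator proposed in this paper; all function and parameter names here are our own, and the optimizer and initialization choices are assumptions.

```python
# Hedged sketch: Gaussian (quasi-)ML estimation of a GARCH(1,1) model,
#   sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1],
# on simulated data. Illustrative only; the paper's SVM estimator differs.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def simulate_garch11(omega, alpha, beta, n):
    """Simulate a zero-mean GARCH(1,1) return series with Gaussian shocks."""
    r = np.empty(n)
    sigma2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = omega + alpha * r[t] ** 2 + beta * sigma2
    return r

def neg_loglik(params, r):
    """Negative Gaussian log-likelihood of a GARCH(1,1) model."""
    omega, alpha, beta = params
    # enforce positivity and covariance stationarity
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf
    sigma2 = np.var(r)  # initialize recursion at the sample variance
    nll = 0.0
    for t in range(len(r)):
        nll += 0.5 * (np.log(2 * np.pi * sigma2) + r[t] ** 2 / sigma2)
        sigma2 = omega + alpha * r[t] ** 2 + beta * sigma2
    return nll

r = simulate_garch11(omega=0.1, alpha=0.1, beta=0.8, n=2000)
fit = minimize(neg_loglik, x0=[0.05, 0.05, 0.9], args=(r,), method="Nelder-Mead")
print(fit.x)  # estimates of (omega, alpha, beta)
```

When the shocks are not Gaussian, this estimator is only quasi-ML, which is one motivation for the nonparametric alternative studied here.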
Support vector machines (SVMs) are state-of-the-art tools
for linear and nonlinear input–output knowledge discovery
(Vapnik 1998, Schölkopf and Smola 2001). They can
be employed for solving pattern recognition and regression
estimation problems.
* Paper presented at Applications of Physics in Financial Analysis (APFA)
3, 5–7 December 2001, Museum of London, UK.
SVMs have been developed in the
machine learning community and resemble, in some ways, a
neural network (NN). But SVMs improve on the most common
NNs (such as multi-layered perceptrons or radial basis function
networks) because the SVM optimization procedure yields not
only the weights of the network but also its architecture.
Furthermore, one of the most desirable properties of an
SVM is that its optimization functional is quadratic with linear
constraints, so it presents a single global minimum and no
undesirable local solutions.
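As a minimal illustration of SVM regression with the standard ε-insensitive loss, the following sketch fits a toy nonlinear function using scikit-learn's off-the-shelf `SVR`. This is not the paper's implementation; the data, kernel, and hyperparameter choices are assumptions made purely for illustration.

```python
# Hedged sketch: epsilon-insensitive support vector regression (SVR) on a
# noisy toy function, using scikit-learn rather than the paper's own solver.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(-3, 3, size=(200, 1)), axis=0)
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(200)

# The underlying optimization is a quadratic program with linear
# constraints, so the fitted solution is the unique global minimum.
model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)
pred = model.predict(X)
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print("support vectors:", model.support_.size, "in-sample RMSE:", round(rmse, 3))
```

Only the training points lying on or outside the ε-tube become support vectors, so the solution is typically sparse; this sparsity, together with the unique minimum, is what distinguishes SVMs from NNs trained by gradient descent.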