Forecasting Volatility in Financial Markets by Introducing a GA-Assisted SVR-GARCH Model

Year
2012
Author(s)
Ali Habibnia
Source
Master's thesis, available at SSRN
Url
https://ssrn.com/abstract=2144922

In recent years, support vector regression (SVR), a nonparametric technique closely related to artificial neural networks (ANNs), has been successfully used for regression estimation and time series forecasting. This thesis deals with the application of SVR to volatility forecasting in financial markets.

An accurate volatility forecast is essential for estimating market risk, and volatility is one of the primary inputs to a wide range of financial applications, from risk measures such as value at risk to asset and derivatives pricing. Many researchers use GARCH models to generate volatility forecasts, and these models are usually estimated by maximum likelihood (ML) under the assumption that the data are normally distributed. In this thesis, we show that GARCH models can be estimated using SVR and that such estimates have higher predictive ability than those obtained via common ML methods.
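As a rough illustration of how a GARCH-type variance equation can be cast as an SVR problem, the sketch below regresses next-period squared returns on lagged squared returns with an RBF-kernel SVR. The simulated data, the lag structure, and the hyperparameter values are placeholder assumptions for illustration, not the exact specification used in the thesis.

```python
import numpy as np
from sklearn.svm import SVR

# Placeholder daily log returns; in the thesis the data are FTSE 100 index returns.
rng = np.random.default_rng(0)
r = rng.normal(scale=0.01, size=1000)

# GARCH(1,1)-style regression: next-period variance as a function of past information.
# Squared returns serve as a noisy proxy for the unobserved conditional variance.
r2 = r ** 2
X = np.column_stack([r2[1:-1], r2[:-2]])  # two lagged squared returns as regressors
y = r2[2:]                                # next-period squared return (variance proxy)

# RBF-kernel SVR; C, epsilon and gamma are the free parameters a GA would tune.
svr = SVR(kernel="rbf", C=1.0, epsilon=1e-5, gamma=0.1)
svr.fit(X[:800], y[:800])

# Out-of-sample one-step-ahead variance forecasts.
sigma2_hat = svr.predict(X[800:])
```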

Two problems arise when using SVR, however: how to choose the optimal input feature subset, and how to set the best kernel parameters. Because there is no structured way to choose the free parameters of the SVR and its kernel function, these parameters are usually set by trial and error, which is rarely optimal.

In this thesis, a novel method, termed GA-assisted SVR, is introduced, in which a genetic algorithm simultaneously searches for the SVR's free parameters and the kernel parameter (in this study, the width of a radial basis function (RBF) kernel).
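The following sketch shows one way such a search could look: a small real-coded genetic algorithm tuning the SVR's C and epsilon together with the RBF width gamma against a validation MSE. The population size, genetic operators, and parameter ranges here are illustrative assumptions, not the settings used in the thesis.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)

# Toy variance-proxy regression data standing in for the SVR-GARCH design matrix.
r2 = rng.normal(scale=0.01, size=600) ** 2
X, y = np.column_stack([r2[1:-1], r2[:-2]]), r2[2:]
X_tr, y_tr, X_val, y_val = X[:400], y[:400], X[400:], y[400:]

def fitness(genes):
    """Validation MSE of an RBF-SVR with the encoded (C, epsilon, gamma)."""
    C, eps, gamma = np.exp(genes)  # genes live on a log scale to keep parameters positive
    model = SVR(kernel="rbf", C=C, epsilon=eps, gamma=gamma)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_val, model.predict(X_val))

# Simple real-coded GA: tournament selection, blend crossover, Gaussian mutation.
pop_size, n_gen, n_genes = 20, 15, 3
pop = rng.uniform(low=[-2, -12, -4], high=[6, -2, 4], size=(pop_size, n_genes))

for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    children = []
    for _ in range(pop_size):
        # Tournament selection of two parents (lower MSE wins).
        i, j, k, l = rng.integers(pop_size, size=4)
        p1 = pop[i] if scores[i] < scores[j] else pop[j]
        p2 = pop[k] if scores[k] < scores[l] else pop[l]
        alpha = rng.uniform(size=n_genes)             # blend crossover
        child = alpha * p1 + (1 - alpha) * p2
        child += rng.normal(scale=0.3, size=n_genes)  # Gaussian mutation
        children.append(child)
    pop = np.array(children)

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("best (C, epsilon, gamma):", np.exp(best))
```

The GA evaluates each candidate parameter set by refitting the SVR, so the same loop could in principle also encode the input feature subset; the version above tunes only the three kernel and regression parameters.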

Based on this optimized RBF-kernel SVR, an SVR-GARCH model is proposed and compared with a parametric GARCH model in terms of its ability to forecast FTSE 100 index return volatility. The experiments show that the GA-assisted SVR-GARCH significantly outperforms the parametric GARCH on both the mean absolute error (MAE) and mean squared error (MSE) criteria.
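For reference, the two evaluation criteria can be computed as below. The arrays are placeholders standing in for an out-of-sample volatility proxy and a model's one-step-ahead forecasts; they are not figures from the thesis.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

# Placeholder series; in the thesis these would be the FTSE 100 volatility proxy
# and each competing model's out-of-sample forecasts.
realized_proxy = np.array([1.2e-4, 0.8e-4, 2.1e-4, 1.5e-4])
forecast       = np.array([1.0e-4, 1.0e-4, 1.8e-4, 1.4e-4])

mae = mean_absolute_error(realized_proxy, forecast)
mse = mean_squared_error(realized_proxy, forecast)
print(f"MAE = {mae:.2e}, MSE = {mse:.2e}")
```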
