Analytic expressions for predictive distributions in mixture autoregressive models

https://doi.org/10.1016/j.spl.2009.04.009

Abstract

We show that the distributions of the multi-step predictors in mixture autoregressive models are also mixtures and specify them analytically. In the case of mixtures of Gaussian or stable distributions, the multi-step distributions can be obtained by simple arithmetic manipulation of the components’ parameters.

Introduction

Mixture autoregressive (MAR) models have been studied by Wong and Li (2000) and Wong (1998). This is a relatively simple class of models having the attractive property that the shape of the conditional distribution of a forecast depends on the recent history of the process. In particular, it may have a varying number of modes over time.

Wong and Li (2000) note that the multi-step conditional distributions of predictors in MAR models are intractable analytically and resort to Monte Carlo simulations. We show that the distributions of the multi-step predictors in MAR models are also mixtures and specify them analytically.

We also demonstrate that (conditional) characteristic functions are the natural tool for calculations in this type of model. Conditional means and variances are not sufficient for prediction (even when they exist) in the presence of severe deviation from normality. Natural alternatives to them are the conditional characteristic functions. A characteristic function contains the entire distributional information. If needed, conditional mean, variance or other moments can be obtained from it. Moreover, unlike variances and means, characteristic functions always exist and hence provide more general results. In the important class of α-stable distributions with 0<α<2, moments of order greater than or equal to α do not exist (see Zolotarev (1986)). Time series models based on stable distributions have been studied actively; see Samorodnitsky and Taqqu (1994), Rachev and Mittnik (2000) and the references therein. Also, it is difficult to obtain manageable forms of the probability densities of the stable distributions (except in some special cases). On the other hand, their characteristic functions have a remarkably simple form and thus are the natural tool to use. Some of the most efficient methods for simulation of α-stable distributions are based on their characteristic functions, not densities.
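As a small illustration of this point (an aside, not from the original paper), the sketch below evaluates the characteristic function of an α-stable law in a standard parameterization and recovers the density by numerical Fourier inversion. The parameterization, the inversion grid and the truncation of the integral are ad hoc choices for the example, and the α = 1 case is omitted.

```python
import numpy as np

def stable_cf(t, alpha, beta=0.0, gamma=1.0, delta=0.0):
    """Characteristic function of an alpha-stable law for alpha != 1
    (a common parameterization; see Samorodnitsky and Taqqu, 1994)."""
    t = np.asarray(t, dtype=float)
    return np.exp(1j * delta * t
                  - (gamma * np.abs(t)) ** alpha
                  * (1.0 - 1j * beta * np.sign(t) * np.tan(np.pi * alpha / 2)))

def stable_pdf(x, alpha, beta=0.0, gamma=1.0, delta=0.0, t_max=60.0, n=200001):
    """Density recovered by numerical Fourier inversion of the characteristic
    function: f(x) = (1/pi) * Re integral_0^inf exp(-i*t*x) * phi(t) dt."""
    t = np.linspace(1e-9, t_max, n)
    dt = t[1] - t[0]
    phi = stable_cf(t, alpha, beta, gamma, delta)
    x = np.atleast_1d(x)
    integrand = np.exp(-1j * np.outer(x, t)) * phi
    return integrand.real.sum(axis=1) * dt / np.pi   # simple Riemann sum

# alpha = 2 is Gaussian with variance 2*gamma**2; compare with the exact value
print(stable_pdf(0.0, alpha=2.0))   # approx 1/(2*sqrt(pi)) ~= 0.2821
print(stable_pdf(0.0, alpha=1.5))   # no closed-form density exists in this case
```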


The MAR model

Let $\pi = (\pi_1, \dots, \pi_g)$ be a discrete distribution such that $\pi_k > 0$ for $k = 1, \dots, g$ and $\sum_{k=1}^{g} \pi_k = 1$. A process $\{y(t)\}$ is said to be a mixture autoregressive process with $g$ components if the conditional distribution function of $y(t+1)$, given the information from the past of the process, is a mixture of the following form:

$$F_{t+1\mid t}(x) \equiv \Pr\bigl(y(t+1) \le x \mid \mathcal{F}_t\bigr) = \sum_{k=1}^{g} \pi_k F_k\!\left(\frac{x - \phi_{k,0} - \sum_{i=1}^{p_k} \phi_{k,i}\, y(t+1-i)}{\sigma_k}\right),$$
where for each $k = 1, \dots, g$, $F_k$ is a distribution function, $\sigma_k > 0$, and $\phi_{k,i}$, $i = 0, 1, \dots, p_k$, are the autoregressive coefficients of the $k$th component.
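For concreteness, here is a small simulation sketch (not from the original paper) of this definition with Gaussian component distributions $F_k$, the case studied by Wong and Li (2000): at each time point a component $k$ is drawn with probability $\pi_k$ and the next value is generated from the corresponding Gaussian autoregression. All parameter values below are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_mar(n, pi, phi0, phi, sigma, burn_in=200):
    """Simulate a mixture autoregressive process with Gaussian components.

    pi    : mixing probabilities pi_1, ..., pi_g
    phi0  : intercepts phi_{k,0}
    phi   : list of AR coefficient arrays, phi[k] = (phi_{k,1}, ..., phi_{k,p_k})
    sigma : component scales sigma_k
    """
    p_max = max(len(c) for c in phi)
    y = np.zeros(n + burn_in + p_max)
    for t in range(p_max, len(y)):
        k = rng.choice(len(pi), p=pi)                   # draw the mixing component
        lags = y[t - np.arange(1, len(phi[k]) + 1)]     # y(t-1), ..., y(t-p_k)
        mean = phi0[k] + phi[k] @ lags                  # conditional mean of component k
        y[t] = mean + sigma[k] * rng.standard_normal()  # Gaussian innovation
    return y[-n:]

# a hypothetical two-component MAR(2; 1, 1) model
y = simulate_mar(500,
                 pi=[0.7, 0.3],
                 phi0=[0.0, 0.0],
                 phi=[np.array([0.5]), np.array([0.9])],
                 sigma=[1.0, 3.0])
print(y[:5])
```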

Main results

Eq. (4) is well suited for one-step prediction since the random elements in it are either in $\mathcal{F}_t$ or are independent of it. For longer horizons Eq. (4) can be applied recursively to eliminate unobserved values of the process. The method is a natural extension of the similar procedure for autoregressive models. For example, for an AR(2) model we obtain equations suitable for prediction two lags ahead by eliminating the unknown value at time $t+1$ as follows:
$$x(t+2) = \phi_1 x(t+1) + \phi_2 x(t) + \varepsilon(t+2) = \phi_1\bigl(\phi_1 x(t) + \phi_2 x(t-1) + \varepsilon(t+1)\bigr) + \phi_2 x(t) + \varepsilon(t+2) = (\phi_1^2 + \phi_2)\, x(t) + \phi_1 \phi_2\, x(t-1) + \phi_1 \varepsilon(t+1) + \varepsilon(t+2).$$
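The snippet below (an illustrative sketch, not code from the paper) carries this recursion out for a Gaussian MAR model with two first-order components: conditioning on the pair of components selected at times $t+1$ and $t+2$ and eliminating $y(t+1)$ gives a two-step predictive distribution that is a mixture of $g^2$ Gaussians, whose weights, means and variances follow from the component parameters by the arithmetic indicated in the comments. All parameter values are invented for the example.

```python
import numpy as np
from itertools import product

# Hypothetical two-component Gaussian MAR(2; 1, 1) model (illustrative values)
pi    = np.array([0.6, 0.4])
phi0  = np.array([0.1, -0.2])   # intercepts phi_{k,0}
phi1  = np.array([0.5, 0.9])    # AR(1) coefficients phi_{k,1}
sigma = np.array([1.0, 2.0])

def two_step_mixture(y_t):
    """Weights, means and std devs of the 2-step predictive mixture given y(t).

    Conditioning on the components (k1, k2) selected at times t+1 and t+2:
      y(t+2) | k1, k2, y(t) ~ N( phi0[k2] + phi1[k2]*(phi0[k1] + phi1[k1]*y_t),
                                 phi1[k2]**2 * sigma[k1]**2 + sigma[k2]**2 )
    with weight pi[k1] * pi[k2], i.e. a mixture of g**2 Gaussians.
    """
    weights, means, stds = [], [], []
    for k1, k2 in product(range(len(pi)), repeat=2):
        m1 = phi0[k1] + phi1[k1] * y_t                   # mean of y(t+1) in component k1
        weights.append(pi[k1] * pi[k2])
        means.append(phi0[k2] + phi1[k2] * m1)
        stds.append(np.sqrt(phi1[k2]**2 * sigma[k1]**2 + sigma[k2]**2))
    return np.array(weights), np.array(means), np.array(stds)

def two_step_density(x, y_t):
    w, m, s = two_step_mixture(y_t)
    x = np.atleast_1d(x)[:, None]
    return (w * np.exp(-0.5 * ((x - m) / s)**2) / (s * np.sqrt(2 * np.pi))).sum(axis=1)

print(two_step_density([0.0, 1.0], y_t=0.5))
```

Extending the same bookkeeping to horizon $h$ yields, in this sketch, a mixture of $g^h$ Gaussian terms, each obtained by the same kind of arithmetic on the component parameters.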

Example

Wong and Li (2000) built a mixture autoregressive model for the daily IBM stock closing price data (Box and Jenkins, 1976) with the following parameters: $g=3$; $\pi = (0.5439, 0.4176, 0.0385)$; $\sigma_1 = 4.8227$, $\sigma_2 = 6.0082$, $\sigma_3 = 18.1716$; $p_1 = 2$, $p_2 = 2$, $p_3 = 1$; $\phi_{k,0} = 0$, $k = 1, 2, 3$; $\phi_{1,1} = 0.6792$, $\phi_{1,2} = 0.3208$, $\phi_{2,1} = 1.6711$, $\phi_{2,2} = -0.6711$, $\phi_{3,1} = 1$. Putting these parameters into (2) gives the one-step conditional density,
$$f_{t+1\mid t}(x) = 0.000845235\, e^{-0.0015142\,(x - y_t)^2} + 0.0449924\, e^{-0.0214976\,(x - 0.6792\, y_t - 0.3208\, y_{t-1})^2} + 0.0277285\, e^{-0.013851\,(x - 1.6711\, y_t + 0.6711\, y_{t-1})^2}.$$
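As a quick sanity check (not part of the original example), the numeric coefficients of the density above follow from the listed parameters by the usual Gaussian arithmetic: the amplitude of the $k$th term is $\pi_k/(\sigma_k\sqrt{2\pi})$ and the exponent factor is $1/(2\sigma_k^2)$.

```python
import numpy as np

# Parameters of the Wong and Li (2000) model for the IBM series
pi    = np.array([0.5439, 0.4176, 0.0385])
sigma = np.array([4.8227, 6.0082, 18.1716])

# Amplitude pi_k / (sigma_k * sqrt(2*pi)) and exponent factor 1 / (2 * sigma_k**2)
amplitude = pi / (sigma * np.sqrt(2 * np.pi))
exponent  = 1.0 / (2.0 * sigma**2)

print(amplitude)  # approx [0.0449924, 0.0277285, 0.000845235]
print(exponent)   # approx [0.0214976, 0.013851,  0.0015142]
```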

Conclusion

We have shown that the multi-step predictors for a mixture autoregressive model are mixtures. Moreover, when the noise components are normal or stable, the predictors remain mixtures of normal or stable distributions, respectively, for all horizons. We have also demonstrated that the conditional characteristic function is a useful and intuitive instrument for the analysis of MAR models.

Acknowledgements

I thank an anonymous referee for the constructive comments and suggestions.

References (7)

  • G.E. Box, G.M. Jenkins, Time Series Analysis: Forecasting and Control (1976)
  • E. Lukacs
  • S. Rachev, S. Mittnik, Stable Paretian Models in Finance (2000)
There are more references available in the full text version of this article.
