
Efficiency combined with simplicity: new testing procedures for Generalized Inverse Gaussian models

  • Original Paper

Abstract

The standard efficient testing procedures in the Generalized Inverse Gaussian (GIG) family (also known as the Halphen Type A family) are likelihood ratio tests, and hence rely on Maximum Likelihood (ML) estimation of the three parameters of the GIG. The particular form of GIG densities, involving modified Bessel functions, in general precludes a closed-form expression for the ML estimators, which must instead be obtained via complex numerical approximation methods. Method of Moments (MM) estimators, by contrast, admit concise expressions, but tests based on these estimators suffer from a lack of efficiency compared to likelihood ratio tests. This is why, in recent years, trade-offs between ML and MM estimators have been proposed, resulting in simpler yet not fully efficient estimators and tests. In the present paper, we propose not such a trade-off but rather an optimal combination of both methods: our tests inherit efficiency from an ML-like construction and simplicity from the MM estimators of the nuisance parameters. This goal is reached by attacking the problem from a new angle, namely via the Le Cam methodology. Besides providing simple efficient testing methods, the theoretical background of this methodology further allows us to write out explicit power expressions for our tests. A Monte Carlo simulation study shows that, even at small sample sizes, our simpler procedures perform at least as well as the complex likelihood ratio tests. We conclude the paper by applying our findings to two real data sets.
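
Readers wanting to experiment with the family can evaluate the GIG density numerically. The sketch below is illustrative (it is ours, not the authors' code): it uses the density form \(c(p,a,b)\,x^{p-1}\mathrm{e}^{-(ax+b/x)/2}\) written out in the Appendix, together with the standard normalizing constant \(c(p,a,b)=(a/b)^{p/2}/(2K_p(\sqrt{ab}))\) and the Bessel-ratio moment formula \(\mathrm{E}[X]=\sqrt{b/a}\,K_{p+1}(\sqrt{ab})/K_p(\sqrt{ab})\), where \(K_p\) is the modified Bessel function of the second kind; the parameter values and integration grids are arbitrary choices.

```python
# Illustrative numerical sketch of the GIG(p, a, b) density
#   f(x) = c(p, a, b) x^{p-1} exp(-(a x + b / x) / 2),  x > 0,
# with normalizing constant c(p, a, b) = (a/b)^{p/2} / (2 K_p(sqrt(ab))).
import math

def bessel_k(nu, z, t_max=12.0, n=12000):
    # K_nu(z) = int_0^inf exp(-z cosh t) cosh(nu t) dt, trapezoidal rule.
    h = t_max / n
    s = 0.5 * (math.exp(-z) + math.exp(-z * math.cosh(t_max)) * math.cosh(nu * t_max))
    for i in range(1, n):
        t = i * h
        s += math.exp(-z * math.cosh(t)) * math.cosh(nu * t)
    return s * h

p, a, b = 2.0, 1.0, 1.0          # arbitrary illustrative parameters
omega = math.sqrt(a * b)
c = (a / b) ** (p / 2) / (2.0 * bessel_k(p, omega))

def pdf(x):
    return c * x ** (p - 1) * math.exp(-(a * x + b / x) / 2.0)

# Riemann-sum checks: the density integrates to 1, and the numerical mean
# matches the Bessel-ratio formula E[X] = sqrt(b/a) K_{p+1}(omega)/K_p(omega).
h = 0.001
xs = [0.001 + i * h for i in range(60000)]
total = sum(pdf(x) for x in xs) * h
mean_num = sum(x * pdf(x) for x in xs) * h
mean_formula = math.sqrt(b / a) * bessel_k(p + 1, omega) / bessel_k(p, omega)
```

The integral representation of \(K_\nu\) keeps the sketch dependency-free; in practice one would use a library Bessel routine instead.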

References

  • Barndorff-Nielsen OE, Halgreen C (1977) Infinite divisibility of the hyperbolic and generalized inverse Gaussian distributions. Z Wahrscheinlichkeitstheorie und Verw Gebiete 38:309–312

  • Bickel PJ, Klaassen CAJ, Ritov Y, Wellner JA (1993) Efficient and adaptive statistical inference for semiparametric models. Johns Hopkins University Press, Baltimore

  • Chebana F, El Adlouni S, Bobée B (2010) Mixed estimation methods for Halphen distributions with applications in extreme hydrologic events. Stoch Environ Res Risk Assess 24:359–376

  • Chhikara RS, Folks L (1989) The inverse Gaussian distribution: theory, methodology, and applications. Marcel Dekker, New York

  • Fitzgerald DL (2000) Statistical aspects of Tricomi’s function and modified Bessel functions of the second kind. Stoch Environ Res Risk Assess 14:139–158

  • Good IJ (1953) The population frequencies of species and the estimation of population parameters. Biometrika 40:237–264

  • Halphen E (1941) Sur un nouveau type de courbe de fréquence. Comptes Rendus de l’Académie des Sciences 213:633–635 (Published under the name of “Dugué” due to war constraints)

  • Iyengar S, Liao Q (1997) Modeling neural activity using the generalized inverse Gaussian distribution. Biol Cybern 77:289–295

  • Jørgensen B (1982) Statistical properties of the generalized inverse Gaussian distribution. Springer, Heidelberg

  • Kreiss J-P (1987) On adaptive estimation in stationary ARMA processes. Ann Stat 15:112–133

  • Le Cam L (1960) Locally asymptotically normal families of distributions. Univ Calif Publ Stat 3:37–98

  • Le Cam L, Yang GL (2000) Asymptotics in statistics: some basic concepts, 2nd edn. Springer, New York

  • Lemonte AJ, Cordeiro GM (2011) The exponentiated generalized inverse Gaussian distribution. Stat Probab Lett 81:506–517

  • Mudholkar SM, Tian L (2002) An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test. J Stat Plan Inference 102:211–221

  • Natarajan R, Mudholkar GS (2004) Moment-based goodness-of-fit tests for the inverse Gaussian distribution. Technometrics 46:339–347

  • Perreault L, Bobée B, Rasmussen PF (1999a) Halphen distribution system. I: Mathematical and statistical properties. J Hydrol Eng 4:189–199

  • Perreault L, Bobée B, Rasmussen PF (1999b) Halphen distribution system. II: Parameter and quantile estimation. J Hydrol Eng 4:200–208

  • Scaillet O (2004) Density estimation using inverse and reciprocal inverse Gaussian kernels. J Nonparametr Stat 16:217–226

  • Seshadri V (1993) The inverse Gaussian distribution: a case study in exponential families. Oxford University Press, Oxford

  • Seshadri V (1999) The inverse Gaussian distribution. Springer, New York

  • Tweedie MCK (1945) Inverse statistical variates. Nature 155:453–453

  • Tweedie MCK (1956) Statistical properties of inverse Gaussian distributions. Va J Sci 7:160–165

  • Tweedie MCK (1957) Statistical properties of inverse Gaussian distributions. Ann Math Stat 28:362–377

  • van der Vaart AW (1998) Asymptotic statistics. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge

Acknowledgments

The research of Christophe Ley is supported by a Mandat de Chargé de Recherche FNRS from the Fonds National de la Recherche Scientifique, Communauté française de Belgique. The authors thank an anonymous referee for helpful comments, and Ivan Nourdin for inviting Christophe Ley to the Institut Elie Cartan for a research visit, during which the present work was initiated and its main parts worked out.

Author information

Correspondence to Christophe Ley.

A Appendix

Proof of Theorem 2.1 Establishing the ULAN property of \(\mathrm{GIG}(p,a,b)\) with respect to all three parameters \(p,a,b\) is quite straightforward since we are not working within a semiparametric family of distributions (hence we do not have to deal with an infinite-dimensional parameter); the problem considered involves a parametric family of distributions with densities meeting the most classical regularity conditions. In particular, one readily obtains that (i) \((p,a,b)\mapsto \sqrt{c(p,a,b) x^{p-1}\mathrm{e}^{-(ax+b/x)/2}}\) is continuously differentiable for every \(x>0\) and (ii) the associated Fisher information matrix is well defined and continuous in \(p,a\) and \(b\). Thus, by Lemma 7.6 of van der Vaart (1998), \((p,a,b)\mapsto \sqrt{c(p,a,b) x^{p-1}\mathrm{e}^{-(ax+b/x)/2}}\) is differentiable in quadratic mean, and the ULAN property follows from Theorem 7.2 of van der Vaart (1998). This completes the proof. \(\square \)
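
The well-definedness of the Fisher information matrix invoked above can be illustrated numerically. The sketch below (ours, not the authors' code) approximates \({\pmb \varGamma }({\pmb \vartheta })=\int s(x)s(x)'f_{{\pmb \vartheta }}(x)\,\mathrm{d}x\) by quadrature, with the score \(s=\nabla_{(p,a,b)}\log f_{{\pmb \vartheta }}\) obtained via a central finite difference of the log normalizing constant \(\log c(p,a,b)=(p/2)\log(a/b)-\log(2K_p(\sqrt{ab}))\); the parameter point, grids, and step sizes are arbitrary choices. It checks that the mean score vanishes and that the resulting matrix is positive definite.

```python
# Illustrative quadrature approximation of the GIG Fisher information
#   Gamma_{ij} = int_0^inf s_i(x) s_j(x) f_theta(x) dx,
# s = grad_theta log f_theta, theta = (p, a, b).
import math

def bessel_k(nu, z, t_max=12.0, n=12000):
    # K_nu(z) = int_0^inf exp(-z cosh t) cosh(nu t) dt, trapezoidal rule.
    h = t_max / n
    s = 0.5 * (math.exp(-z) + math.exp(-z * math.cosh(t_max)) * math.cosh(nu * t_max))
    for i in range(1, n):
        t = i * h
        s += math.exp(-z * math.cosh(t)) * math.cosh(nu * t)
    return s * h

def log_c(p, a, b):
    return 0.5 * p * math.log(a / b) - math.log(2.0 * bessel_k(p, math.sqrt(a * b)))

theta = (2.0, 1.0, 1.0)  # arbitrary illustrative (p, a, b)
eps = 1e-5
grad_c = []              # central finite-difference gradient of log c
for k in range(3):
    up, down = list(theta), list(theta)
    up[k] += eps
    down[k] -= eps
    grad_c.append((log_c(*up) - log_c(*down)) / (2.0 * eps))

p, a, b = theta
c = math.exp(log_c(p, a, b))

def pdf(x):
    return c * x ** (p - 1) * math.exp(-(a * x + b / x) / 2.0)

def score(x):
    # d log f/dp = log x + d log c/dp; d/da adds -x/2; d/db adds -1/(2x).
    return (math.log(x) + grad_c[0], -x / 2.0 + grad_c[1], -1.0 / (2.0 * x) + grad_c[2])

h = 0.001
mean_score = [0.0, 0.0, 0.0]
gamma = [[0.0] * 3 for _ in range(3)]
for i in range(60000):
    x = 0.001 + i * h
    w = pdf(x) * h
    s = score(x)
    for r in range(3):
        mean_score[r] += s[r] * w
        for cc in range(3):
            gamma[r][cc] += s[r] * s[cc] * w

# leading principal minors (positive definiteness check)
d2 = gamma[0][0] * gamma[1][1] - gamma[0][1] ** 2
d3 = (gamma[0][0] * (gamma[1][1] * gamma[2][2] - gamma[1][2] ** 2)
      - gamma[0][1] * (gamma[0][1] * gamma[2][2] - gamma[1][2] * gamma[0][2])
      + gamma[0][2] * (gamma[0][1] * gamma[1][2] - gamma[1][1] * gamma[0][2]))
```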

Proof of Proposition 3.1  It follows from the asymptotic linearity property of the central sequence given in (3) that, under \(\mathrm{P}_{{\pmb \vartheta }}^{(n)}\) and for \(n\rightarrow \infty \),

$$\begin{aligned} {{\pmb \varDelta }}^{(n)\mathrm{eff}}_p\left( {\pmb \vartheta }+n^{-1/2} (0,\tau _2^{(n)},\tau _3^{(n)})'\right) ={{\pmb \varDelta }}^{(n)\mathrm{eff}}_p({\pmb \vartheta })-{\pmb P}({\pmb \vartheta }){\pmb \varGamma }({\pmb \vartheta }) \begin{pmatrix} 0\\ \tau _2^{(n)}\\ \tau _3^{(n)} \end{pmatrix} +o_\mathrm{{P}}(1), \end{aligned}$$

where

$$\begin{aligned} {\pmb P}({\pmb \vartheta }) = \left( 1,\; -(\varGamma _{p,a}({\pmb \vartheta }),\varGamma _{p,b}({\pmb \vartheta })) \begin{pmatrix} \varGamma _{b,b}({\pmb \vartheta }) &amp; -\varGamma _{a,b}({\pmb \vartheta })\\ -\varGamma _{a,b}({\pmb \vartheta }) &amp; \varGamma _{a,a}({\pmb \vartheta }) \end{pmatrix} \Big/ \bigl (\varGamma _{b,b}({\pmb \vartheta })\varGamma _{a,a}({\pmb \vartheta })-(\varGamma _{a,b}({\pmb \vartheta }))^2\bigr )\right) \end{aligned}$$

is the projection matrix onto the subspace orthogonal to the central sequences for \(a\) and \(b\). One easily checks that the product \({\pmb P}({\pmb \vartheta }){\pmb \varGamma }({\pmb \vartheta })\) is of the form \((\cdot ,0,0)\), leading to the announced asymptotic linearity (4). The other asymptotic equality in probability follows by continuity. \(\square \)
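
The annihilation property used in the preceding proof is easy to verify numerically. The sketch below (ours, with a made-up symmetric positive definite "Fisher" matrix, ordered as \((p,a,b)\)) builds \({\pmb P}({\pmb \vartheta })\) exactly as displayed above and checks that \({\pmb P}({\pmb \vartheta }){\pmb \varGamma }({\pmb \vartheta })\) has the form \((\cdot,0,0)\), with first entry equal to the efficient Fisher information \(\varGamma _{p,p}-(\varGamma _{p,a},\varGamma _{p,b})\,{\pmb \varGamma }_{(a,b)}^{-1}(\varGamma _{p,a},\varGamma _{p,b})'\).

```python
# Numerical check of the projection step, on a hypothetical 3x3 matrix.
Gamma = [[2.0, 0.5, 0.3],
         [0.5, 1.5, 0.4],
         [0.3, 0.4, 1.2]]

G_pa, G_pb = Gamma[0][1], Gamma[0][2]
G_aa, G_ab, G_bb = Gamma[1][1], Gamma[1][2], Gamma[2][2]
det_B = G_aa * G_bb - G_ab ** 2

# P = (1, -(G_pa, G_pb) [[G_bb, -G_ab], [-G_ab, G_aa]] / det_B)
w1 = -(G_pa * G_bb - G_pb * G_ab) / det_B
w2 = -(-G_pa * G_ab + G_pb * G_aa) / det_B
P = [1.0, w1, w2]

# P Gamma should equal (Gamma_eff, 0, 0).
PG = [sum(P[k] * Gamma[k][j] for k in range(3)) for j in range(3)]
gamma_eff = Gamma[0][0] - (G_pa * (G_bb * G_pa - G_ab * G_pb)
                           + G_pb * (-G_ab * G_pa + G_aa * G_pb)) / det_B
```

The last two entries of `PG` vanish because the (a, b) block of Gamma is mapped onto itself by its own inverse, exactly as in the proof.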

Proof of Theorem 3.1 The statement in Part (i) is easily proved thanks to the asymptotic linearity in (6) under the null, since \(\varDelta ^{(n)\mathrm{eff}}_p(p_0,a,b)\) is asymptotically \(\mathcal {N}(0, \varGamma _{p,p}^{\mathrm{eff}}(p_0,a,b))\) under \(\bigcup _{a\in \mathbb R_0^+} \bigcup _{b\in \mathbb R_0^+}\mathrm{P}^{(n)}_{p_0,a,b}\) by the central limit theorem.

In order to prove the more delicate Part (ii), observe that, under \(\mathrm{P}^{(n)}_{p_0,a,b}\) and for any bounded sequence \({{\pmb \tau }}^{(n)}=(\tau _1^{(n)}, {\tau }_2^{(n)}, \tau _3^{(n)}) '\in \mathbb R^3\), as \(n\rightarrow \infty \),

$$\begin{aligned}&\begin{pmatrix} \varDelta _p^{(n)\mathrm{eff}}(p_0,a,b) \\ \Lambda ^{(n)}_{(p_0,a,b)'+n^{-1/2}{\pmb \tau }^{(n)}/(p_0,a,b)'} \end{pmatrix}\\&\quad \mathop {\longrightarrow }\limits ^{\mathcal {L}}\mathcal {N}_2 \left( \begin{pmatrix} 0 \\ -\frac{1}{2} {\pmb \tau }'{\pmb \varGamma }({\pmb \vartheta }){\pmb \tau }\end{pmatrix}, \begin{pmatrix} \varGamma _{p,p}^{\mathrm{eff}}(p_0,a,b) &amp; \tau _1\varGamma _{p,p}^{\mathrm{eff}}(p_0,a,b)\\ \tau _1 \varGamma _{p,p}^{\mathrm{eff}}(p_0,a,b) &amp; {\pmb \tau }'{\pmb \varGamma }({\pmb \vartheta }){\pmb \tau }\end{pmatrix}\right) , \end{aligned}$$

where \(\Lambda ^{(n)}_{(p_0,a,b)'+n^{-1/2}{\pmb \tau }^{(n)}/(p_0,a,b)'}\) is the log-likelihood ratio and \({\pmb \tau }=\lim _{n \rightarrow \infty } {\pmb \tau }^{(n)}\). We can then apply Le Cam’s third lemma which implies that \(\varDelta _{p}^{(n)\mathrm{eff}}(p_0,a,b)\) is asymptotically \(\mathcal {N}\left( \tau _1\varGamma _{p,p}^{\mathrm{eff}}(p_0,a, b), \varGamma _{p,p}^{\mathrm{eff}}(p_0,a,b)\right) \) under \(\mathrm{P}^{(n)}_{p_0+n^{-1/2}\tau _1^{(n)},a,b}\). Since the asymptotic linearity (6) holds as well under \(\mathrm{P}^{(n)}_{p_0+n^{-1/2}\tau _1^{(n)},a,b}\) by contiguity, Part (ii) of the theorem readily follows.

As regards Part (iii), the fact that \(\phi ^{(n)}_{p_0}\) has asymptotic level \(\alpha \) follows directly from the asymptotic null distribution given in Part (i), while local asymptotic maximinity is a consequence of the convergence of the local experiments to the Gaussian shift experiment (see Le Cam and Yang 2000). \(\square \)
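
The shift in Part (ii) translates directly into an explicit asymptotic power curve: the standardized statistic \(\varDelta _p^{(n)\mathrm{eff}}/\sqrt{\varGamma _{p,p}^{\mathrm{eff}}}\) is asymptotically \(\mathcal {N}(\tau _1\sqrt{\varGamma _{p,p}^{\mathrm{eff}}},1)\) under the local alternative \(p_0+n^{-1/2}\tau _1\). The sketch below (ours, assuming a two-sided level-\(\alpha\) test and a hypothetical value of \(\varGamma _{p,p}^{\mathrm{eff}}\)) evaluates the resulting power \(1-\varPhi (z_{\alpha /2}-\delta )+\varPhi (-z_{\alpha /2}-\delta )\) with \(\delta =\tau _1\sqrt{\varGamma _{p,p}^{\mathrm{eff}}}\).

```python
# Asymptotic power of a two-sided level-alpha test under a N(shift, 1)
# limit for the standardized statistic (Le Cam's third lemma shift).
import math
from statistics import NormalDist

def asymptotic_power(tau1, gamma_eff, alpha=0.05):
    nd = NormalDist()
    z = nd.inv_cdf(1.0 - alpha / 2.0)      # two-sided critical value
    shift = tau1 * math.sqrt(gamma_eff)    # noncentral mean of the statistic
    return 1.0 - nd.cdf(z - shift) + nd.cdf(-z - shift)

power_null = asymptotic_power(0.0, 1.8)    # reduces to the level alpha
power_alt = asymptotic_power(1.5, 1.8)     # power against a local alternative
```

Larger efficient information or larger local shifts increase the power, consistent with the maximinity statement in Part (iii).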

Cite this article

Koudou, A.E., Ley, C. Efficiency combined with simplicity: new testing procedures for Generalized Inverse Gaussian models. TEST 23, 708–724 (2014). https://doi.org/10.1007/s11749-014-0378-2
