Abstract
The standard efficient testing procedures in the Generalized Inverse Gaussian (GIG) family (also known as the Halphen Type A family) are likelihood ratio tests, and hence rely on Maximum Likelihood (ML) estimation of the three parameters of the GIG. The particular form of GIG densities, involving modified Bessel functions, in general precludes closed-form expressions for ML estimators, which must instead be obtained by complex numerical approximation methods. By contrast, Method of Moments (MM) estimators allow for concise expressions, but tests based on these estimators suffer from a lack of efficiency compared to likelihood ratio tests. This is why, in recent years, trade-offs between ML and MM estimators have been proposed, resulting in simpler yet not completely efficient estimators and tests. In the present paper, we propose not such a trade-off but rather an optimal combination of both methods: our tests inherit efficiency from an ML-like construction and simplicity from the MM estimators of the nuisance parameters. This goal is reached by attacking the problem from a new angle, namely via the Le Cam methodology. Besides providing simple efficient testing methods, the theoretical background of this methodology further allows us to write out explicit power expressions for our tests. A Monte Carlo simulation study shows that, even at small sample sizes, our simpler procedures perform at least as well as the complex likelihood ratio tests. We conclude the paper by applying our findings to two real data sets.
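To fix ideas, the \(\mathrm{GIG}(p,a,b)\) density is proportional to \(x^{p-1}\mathrm{e}^{-(ax+b/x)/2}\) on \(x>0\), with a normalizing constant involving the modified Bessel function of the second kind \(K_p\), and its moments are ratios of such Bessel functions, which is what makes moment-based estimation tractable. The following is a minimal numerical sketch (not code from the paper; `scipy.special.kv` implements \(K_\nu\)):

```python
import numpy as np
from scipy.special import kv          # modified Bessel function K_nu
from scipy.integrate import quad

def gig_pdf(x, p, a, b):
    """GIG(p, a, b) density: c(p,a,b) * x**(p-1) * exp(-(a*x + b/x)/2), x > 0."""
    omega = np.sqrt(a * b)
    c = (a / b) ** (p / 2) / (2.0 * kv(p, omega))   # normalizing constant
    return c * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

def gig_moment(r, p, a, b):
    """r-th moment of GIG(p, a, b): (b/a)**(r/2) * K_{p+r}(sqrt(ab)) / K_p(sqrt(ab))."""
    omega = np.sqrt(a * b)
    return (b / a) ** (r / 2.0) * kv(p + r, omega) / kv(p, omega)
```

Matching sample moments (e.g. for \(r = 1, 2, -1\)) to `gig_moment(r, p, a, b)` gives one possible system of MM equations in \((p,a,b)\).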
References
Barndorff-Nielsen OE, Halgreen C (1977) Infinite divisibility of the Hyperbolic and generalized inverse Gaussian distribution. Z Wahrscheinlichkeitstheorie und Verw Gebiete 38:309–312
Bickel PJ, Klaassen CAJ, Ritov Y, Wellner JA (1993) Efficient and adaptive statistical inference for semiparametric models. Johns Hopkins University Press, Baltimore
Chebana F, El Adlouni S, Bobée B (2010) Mixed estimation methods for Halphen distributions with applications in extreme hydrologic events. Stoch Environ Res Risk Assess 24:359–376
Chhikara RS, Folks L (1989) The inverse Gaussian distribution: theory, methodology, and applications. Marcel Dekker, New York
Fitzgerald DL (2000) Statistical aspects of Tricomi’s function and modified Bessel functions of the second kind. Stoch Environ Res Risk Assess 14:139–158
Good IJ (1953) The population frequencies of species and the estimation of population parameters. Biometrika 40:237–264
Halphen E (1941) Sur un nouveau type de courbe de fréquence. Comptes Rendus de l’Académie des Sciences 213:633–635 (Published under the name of “Dugué” due to war constraints)
Iyengar S, Liao Q (1997) Modeling neural activity using the generalized inverse Gaussian distribution. Biol Cybern 77:289–295
Jørgensen B (1982) Statistical properties of the generalized inverse Gaussian distribution. Springer, Heidelberg
Kreiss J-P (1987) On adaptive estimation in stationary ARMA processes. Ann Stat 15:112–133
Le Cam L (1960) Locally asymptotically normal families of distributions. Univ Calif Publ Stat 3:37–98
Le Cam L, Yang GL (2000) Asymptotics in statistics: some basic concepts, 2nd edn. Springer, New York
Lemonte AJ, Cordeiro GM (2011) The exponentiated generalized inverse Gaussian distribution. Stat Probab Lett 81:506–517
Mudholkar SM, Tian L (2002) An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test. J Stat Plan Inference 102:211–221
Natarajan R, Mudholkar GS (2004) Moment-based goodness-of-fit tests for the inverse Gaussian distribution. Technometrics 46:339–347
Perreault L, Bobée B, Rasmussen PF (1999a) Halphen distribution system. I: Mathematical and statistical properties. J Hydrol Eng 4:189–199
Perreault L, Bobée B, Rasmussen PF (1999b) Halphen distribution system. II: Parameter and quantile estimation. J Hydrol Eng 4:200–208
Scaillet O (2004) Density estimation using inverse and reciprocal inverse Gaussian kernels. J Nonparametr Stat 16:217–226
Seshadri V (1993) The inverse Gaussian distribution: a case study in exponential families. Oxford University Press, Oxford
Seshadri V (1999) The inverse Gaussian distribution. Springer, New York
Tweedie MCK (1945) Inverse statistical variates. Nature 155:453–453
Tweedie MCK (1956) Statistical properties of inverse Gaussian distributions. Va J Sci 7:160–165
Tweedie MCK (1957) Statistical properties of inverse Gaussian distributions. Ann Math Stat 28:362–377
van der Vaart AW (1998) Asymptotic statistics. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge
Acknowledgments
The research of Christophe Ley is supported by a Mandat de Chargé de Recherche FNRS from the Fonds National de la Recherche Scientifique, Communauté française de Belgique. The authors thank an anonymous referee for helpful comments, and Ivan Nourdin for inviting Christophe Ley for a visit to the Institut Elie Cartan, a research stay during which the present work was initiated and its main parts worked out.
A Appendix
Proof of Theorem 2.1 Establishing the ULAN property of \(\mathrm{GIG}(p,a,b)\) with respect to all three parameters \(p,a,b\) is quite straightforward since we are not working within a semiparametric family of distributions (hence we do not have to deal with an infinite-dimensional parameter); the problem considered involves a parametric family of distributions with densities meeting the most classical regularity conditions. In particular, one readily obtains that (i) \((p,a,b)\mapsto \sqrt{c(p,a,b) x^{p-1}\mathrm{e}^{-(ax+b/x)/2}}\) is continuously differentiable for every \(x>0\) and (ii) the associated Fisher information matrix is well defined and continuous in \(p,a\) and \(b\). Thus, by Lemma 7.6 of van der Vaart (1998), \((p,a,b)\mapsto \sqrt{c(p,a,b) x^{p-1}\mathrm{e}^{-(ax+b/x)/2}}\) is differentiable in quadratic mean, and the ULAN property follows from Theorem 7.2 of van der Vaart (1998). This completes the proof. \(\square \)
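For the reader's convenience, we recall the quadratic expansion that the ULAN property asserts; this is the standard Le Cam form, stated here in the notation of the proof of Theorem 3.1 (the explicit central sequence \({\pmb \Delta }^{(n)}({\pmb \vartheta })\) and information matrix \({\pmb \varGamma }({\pmb \vartheta })\) are as defined in the paper and not reproduced here): under \(\mathrm{P}^{(n)}_{{\pmb \vartheta }}\) and for any bounded sequence \({\pmb \tau }^{(n)}\in \mathbb R^3\),

$$\Lambda ^{(n)}_{{\pmb \vartheta }+n^{-1/2}{\pmb \tau }^{(n)}/{\pmb \vartheta }} = ({\pmb \tau }^{(n)})'\,{\pmb \Delta }^{(n)}({\pmb \vartheta }) - \frac{1}{2}\,({\pmb \tau }^{(n)})'\,{\pmb \varGamma }({\pmb \vartheta })\,{\pmb \tau }^{(n)} + o_{\mathrm{P}}(1),$$

with \({\pmb \Delta }^{(n)}({\pmb \vartheta })\) asymptotically \(\mathcal {N}({\pmb 0},{\pmb \varGamma }({\pmb \vartheta }))\) under \(\mathrm{P}^{(n)}_{{\pmb \vartheta }}\).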
Proof of Proposition 3.1 It follows from the asymptotic linearity property of the central sequence given in (3) that, under \(\mathrm{P}_{{\pmb \vartheta }}^{(n)}\) and for \(n\rightarrow \infty \),
where
is the projection matrix onto the subspace orthogonal to the central sequences for \(a\) and \(b\). One easily checks that the product \({\pmb P}({\pmb \vartheta }){\pmb \varGamma }({\pmb \vartheta })\) is of the form \((\cdot ,0,0)\), leading to the announced asymptotic linearity (4). The other asymptotic equality in probability follows by continuity. \(\square \)
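One standard way to realize this projection is via the efficient-score construction: the \(p\)-component is projected onto the orthogonal complement of the \((a,b)\)-central sequences, and the resulting efficient information is the Schur complement of the \((a,b)\)-block in \({\pmb \varGamma }({\pmb \vartheta })\). A generic numerical sketch (any symmetric positive definite \(3\times 3\) matrix ordered as \((p,a,b)\) will do; this is not the paper's explicit \({\pmb \varGamma }({\pmb \vartheta })\)):

```python
import numpy as np

def efficient_projection(gamma):
    """Row vector P such that P @ gamma = (gamma_eff, 0, 0), where gamma is a
    3x3 information matrix ordered (p, a, b) and gamma_eff is the Schur
    complement gamma_pp - gamma_{p,(a,b)} @ inv(gamma_{(a,b)}) @ gamma_{(a,b),p}."""
    g_p_ab = gamma[0, 1:]            # cross-information of p with (a, b)
    g_ab = gamma[1:, 1:]             # information block of (a, b)
    return np.concatenate(([1.0], -g_p_ab @ np.linalg.inv(g_ab)))
```

Applying `efficient_projection` to any such matrix annihilates the last two columns of the product, mirroring the \((\cdot ,0,0)\) form of \({\pmb P}({\pmb \vartheta }){\pmb \varGamma }({\pmb \vartheta })\).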
Proof of Theorem 3.1 The statement in Part (i) is easily proved thanks to the asymptotic linearity in (6) under the null, since \(\varDelta ^{(n)\mathrm{eff}}_p(p_0,a,b)\) is asymptotically \(\mathcal {N}(0, \varGamma _{p,p}^{\mathrm{eff}}(p_0,a,b))\) under \(\bigcup _{a\in \mathbb R_0^+} \bigcup _{b\in \mathbb R_0^+}\mathrm{P}^{(n)}_{p_0,a,b}\) by the central limit theorem.
In order to prove the more delicate Part (ii), observe that, under \(\mathrm{P}^{(n)}_{p_0,a,b}\) and for any bounded sequence \({{\pmb \tau }}^{(n)}=(\tau _1^{(n)}, {\tau }_2^{(n)}, \tau _3^{(n)}) '\in \mathbb R^3\), as \(n\rightarrow \infty \),
where \(\Lambda ^{(n)}_{(p_0,a,b)'+n^{-1/2}{\pmb \tau }^{(n)}/(p_0,a,b)'}\) is the log-likelihood ratio and \({\pmb \tau }=\lim _{n \rightarrow \infty } {\pmb \tau }^{(n)}\). We can then apply Le Cam’s third lemma which implies that \(\varDelta _{p}^{(n)\mathrm{eff}}(p_0,a,b)\) is asymptotically \(\mathcal {N}\left( \tau _1\varGamma _{p,p}^{\mathrm{eff}}(p_0,a, b), \varGamma _{p,p}^{\mathrm{eff}}(p_0,a,b)\right) \) under \(\mathrm{P}^{(n)}_{p_0+n^{-1/2}\tau _1^{(n)},a,b}\). Since the asymptotic linearity (6) holds as well under \(\mathrm{P}^{(n)}_{p_0+n^{-1/2}\tau _1^{(n)},a,b}\) by contiguity, Part (ii) of the theorem readily follows.
As regards Part (iii), the fact that \(\phi ^{(n)}_{p_0}\) has asymptotic level \(\alpha \) follows directly from the asymptotic null distribution given in Part (i), while local asymptotic maximinity is a consequence of the convergence of the local experiments to the Gaussian shift experiment (see Le Cam and Yang 2000). \(\square \)
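As an illustration of the power expressions delivered by Part (ii), the following sketch evaluates the asymptotic power of a level-\(\alpha \) test whose statistic is \(\mathcal {N}(\tau _1\varGamma _{p,p}^{\mathrm{eff}}, \varGamma _{p,p}^{\mathrm{eff}})\) under the local alternative, as given by Le Cam's third lemma. This is a generic Gaussian-shift computation assuming a two-sided rejection region; the value of \(\varGamma _{p,p}^{\mathrm{eff}}\) passed in is a placeholder, not one computed from the paper's formulas:

```python
import numpy as np
from scipy.stats import norm

def local_power(tau1, gamma_eff, alpha=0.05):
    """Asymptotic power of a two-sided level-alpha test when the standardized
    statistic is N(tau1 * sqrt(gamma_eff), 1) under the local alternative."""
    z = norm.ppf(1.0 - alpha / 2.0)
    shift = tau1 * np.sqrt(gamma_eff)
    return 1.0 - norm.cdf(z - shift) + norm.cdf(-z - shift)
```

At \(\tau _1=0\) this reduces to the nominal level \(\alpha \), and the power is increasing in \(|\tau _1|\sqrt{\varGamma _{p,p}^{\mathrm{eff}}}\), so a larger efficient information yields strictly better local power.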
Cite this article
Koudou, A.E., Ley, C. Efficiency combined with simplicity: new testing procedures for Generalized Inverse Gaussian models. TEST 23, 708–724 (2014). https://doi.org/10.1007/s11749-014-0378-2
Keywords
- Asymptotic linearity
- GIG distributions
- IG distributions
- Maximin tests
- Uniform local asymptotic normality