A matrix extension of the Cauchy-Schwarz inequality

https://doi.org/10.1016/S0165-1765(99)00014-2

Abstract

A simple argument is used to obtain a very useful generalization of the well-known Cauchy-Schwarz inequality.

Main result

Notation 1.

Let A and B be two p×p matrices. We write A ≤ B if and only if B − A is non-negative definite. ‖A‖ denotes the Euclidean norm of a matrix; i.e. ‖A‖ = (∑_{1≤i,j≤p} a_{i,j}²)^{1/2}. The transpose of A is given by A′.
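
The two pieces of notation can be checked numerically. The sketch below (a minimal NumPy illustration; the matrices are arbitrary examples, not from the paper) verifies an ordering A ≤ B by testing that every eigenvalue of B − A is non-negative, and computes the Euclidean (Frobenius) norm directly from the definition:

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 1.0]])
B = np.array([[2.0, 1.0], [1.0, 2.0]])

# A <= B iff B - A is non-negative definite, i.e. all eigenvalues of B - A are >= 0.
print(np.all(np.linalg.eigvalsh(B - A) >= 0))  # True: eigenvalues of B - A are 0 and 2

# ||A|| is the Euclidean (Frobenius) norm: the square root of the sum of squared entries.
print(np.linalg.norm(A, "fro"))  # sqrt(2) for the 2x2 identity
```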

Using this notation we prove the following, very useful, inequality.

Theorem 1.

Let x ∈ ℝ^p and y ∈ ℝ^q be random vectors such that E‖x‖² < ∞, E‖y‖² < ∞, and Eyy′ is non-singular. Then, (Exy′)(Eyy′)⁻¹(Eyx′) ≤ Exx′.
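
Since the empirical distribution of a sample is itself a probability measure, the inequality holds exactly when the expectations are replaced by sample second moments. The following minimal NumPy sketch (dimensions and the simulated data are illustrative, not from the paper) checks that Exx′ − (Exy′)(Eyy′)⁻¹(Eyx′) is non-negative definite by inspecting its smallest eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 500, 3, 2

# Simulated random vectors: y is correlated with the first q coordinates of x.
x = rng.standard_normal((n, p))
y = x[:, :q] + 0.5 * rng.standard_normal((n, q))

# Sample second-moment matrices standing in for Exx', Eyy', Exy'.
Exx = x.T @ x / n
Eyy = y.T @ y / n
Exy = x.T @ y / n

# The "gap" Exx' - (Exy')(Eyy')^{-1}(Eyx') is a Schur complement of the
# joint second-moment matrix of (x, y), hence non-negative definite.
gap = Exx - Exy @ np.linalg.inv(Eyy) @ Exy.T
print(np.linalg.eigvalsh(gap).min() >= -1e-10)  # True
```

The Schur-complement view also suggests a short proof: the joint second-moment matrix of (x′, y′)′ is non-negative definite, and the quantity on the left of the inequality is exactly what must be subtracted from Exx′ to form its Schur complement.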

Example 1.1.

(A Simple Application). As an application of the above inequality consider the regression model y = θx + γz + ε with …

Conclusion

This inequality was obtained in response to a question asked by some students while I was teaching a graduate course in Econometrics. I have found it to be a handy little tool while looking at issues related to asymptotic efficiency of estimators. Although this inequality looks astonishingly familiar, I have been unable to discover any references to it in the literature that I have reviewed.

