Abstract

Under some simple conditions, by using techniques such as the truncation method for random variables (see, e.g., Gut (2005)) and properties of martingale differences, we study the moving average process based on martingale differences and obtain complete convergence and complete moment convergence for this process. Our results extend some related results.

1. Introduction

Let $\{Y_i, -\infty < i < \infty\}$ be a doubly infinite sequence of random variables. Assume that $\{a_i, -\infty < i < \infty\}$ is an absolutely summable sequence of real numbers and
$$X_n = \sum_{i=-\infty}^{\infty} a_i Y_{i+n}, \qquad n \ge 1, \tag{1.1}$$
is the moving average process based on the sequence $\{Y_i, -\infty < i < \infty\}$. As usual, $S_n = \sum_{k=1}^{n} X_k$, $n \ge 1$, denotes the sequence of partial sums.
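For instance (an illustrative special case, not part of the paper's hypotheses), taking $a_0 = a_1 = 1/2$ and $a_i = 0$ otherwise yields the two-term moving average
$$X_n = \frac{1}{2}\left(Y_n + Y_{n+1}\right), \qquad S_n = \frac{1}{2}Y_1 + \sum_{k=2}^{n} Y_k + \frac{1}{2}Y_{n+1},$$
so $S_n$ differs from the partial sum of the innovations only by two boundary terms; this is the intuition behind transferring limit theorems from $\{Y_i\}$ to $\{X_n\}$.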

For the moving average process $\{X_n, n \ge 1\}$, where $\{Y_i, -\infty < i < \infty\}$ is a sequence of independent identically distributed (i.i.d.) random variables, Ibragimov [1] established the central limit theorem, Burton and Dehling [2] obtained a large deviation principle, and Li et al. [3] gave the complete convergence result for $\{X_n, n \ge 1\}$. Zhang [4] and Li and Zhang [5] extended the complete convergence of the moving average process from i.i.d. sequences to $\varphi$-mixing and NA sequences, respectively. Theorems A and B below are due to Zhang [4] and Kim et al. [6], respectively.

Theorem A. Suppose that $\{Y_i, -\infty < i < \infty\}$ is a sequence of identically distributed $\varphi$-mixing random variables with $\sum_{n=1}^{\infty} \varphi^{1/2}(n) < \infty$ and $\{X_n, n \ge 1\}$ is as in (1.1). Let $l(x)$ be a slowly varying function and $1 \le p < 2$, $r \ge 1 + p/2$. If $EY_1 = 0$ and $E|Y_1|^{r} l\left(|Y_1|^{p}\right) < \infty$, then
$$\sum_{n=1}^{\infty} n^{r/p-2}\, l(n)\, P\left(|S_n| \ge \epsilon n^{1/p}\right) < \infty, \qquad \forall\, \epsilon > 0.$$

Theorem B. Suppose that $\{Y_i, -\infty < i < \infty\}$ is a sequence of identically distributed $\varphi$-mixing random variables with $EY_1 = 0$, $EY_1^2 < \infty$, and $\sum_{n=1}^{\infty} \varphi^{1/2}(n) < \infty$, and $\{X_n, n \ge 1\}$ is as in (1.1). Let $l(x)$ be a slowly varying function and $1 \le p < 2$, $r > 1 + p/2$. If $E|Y_1|^{r} l\left(|Y_1|^{p}\right) < \infty$, then
$$\sum_{n=1}^{\infty} n^{r/p-2-1/p}\, l(n)\, E\left\{|S_n| - \epsilon n^{1/p}\right\}^{+} < \infty, \qquad \forall\, \epsilon > 0,$$
where $x^{+} = \max\{x, 0\}$.
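We note (a standard observation, stated in the notation above) that complete moment convergence of the type in Theorem B is stronger than the complete convergence in Theorem A: since $|S_n| - \epsilon n^{1/p} \ge \epsilon n^{1/p}$ on the event $\{|S_n| \ge 2\epsilon n^{1/p}\}$,
$$E\left\{|S_n| - \epsilon n^{1/p}\right\}^{+} \ge \epsilon n^{1/p}\, P\left(|S_n| \ge 2\epsilon n^{1/p}\right),$$
so the series in Theorem B dominates $\epsilon \sum_{n=1}^{\infty} n^{r/p-2} l(n) P\left(|S_n| \ge 2\epsilon n^{1/p}\right)$.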

Chen et al. [7] and Zhou [8] also studied the limiting behavior of moving average processes under the $\varphi$-mixing assumption. Yang et al. [9] investigated the moving average process for AANA sequences. For more works on complete convergence, one can refer to [3–6, 10–13] and the references therein.

Recall that the sequence $\{Y_i, -\infty < i < \infty\}$ is stochastically dominated by a nonnegative random variable $Y$ if
$$P\left(|Y_i| > x\right) \le C\, P\left(Y > x\right), \qquad \forall\, x \ge 0,\ -\infty < i < \infty,$$
for some positive constant $C$.
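For example (an illustration only), if the $Y_i$ are identically distributed, then the definition is satisfied trivially with $Y = |Y_1|$ and $C = 1$:
$$P\left(|Y_i| > x\right) = P\left(|Y_1| > x\right) = P\left(Y > x\right), \qquad x \ge 0.$$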

Recently, Chen and Li [14] investigated the limiting behavior of moving average processes based on martingale difference sequences. They obtained the following theorems.

Theorem C. Let $1 \le p < 2$ and $r > 1$. Assume that $\{X_n, n \ge 1\}$ is a moving average process defined in (1.1), where $\{Y_i, -\infty < i < \infty\}$ is a martingale difference related to an increasing sequence of $\sigma$-fields $\{\mathcal{F}_i, -\infty < i < \infty\}$ and stochastically dominated by a nonnegative random variable $Y$. If $EY^{rp} < \infty$ and $E\left(Y_i^{2} \mid \mathcal{F}_{i-1}\right) \le C$ a.s., then for every $\epsilon > 0$,
$$\sum_{n=1}^{\infty} n^{r-2}\, P\left(\max_{1\le k\le n} |S_k| \ge \epsilon n^{1/p}\right) < \infty.$$

Theorem D. Let $1 \le p < 2$, $r > 1$, and $E\left(Y_i^{2} \mid \mathcal{F}_{i-1}\right) \le C$ a.s. Assume that $\{X_n, n \ge 1\}$ is a moving average process defined in (1.1), where $\{Y_i, -\infty < i < \infty\}$ is a martingale difference related to an increasing sequence of $\sigma$-fields $\{\mathcal{F}_i, -\infty < i < \infty\}$ and stochastically dominated by a nonnegative random variable $Y$. If $EY^{v} < \infty$, then for every $\epsilon > 0$,
$$\sum_{n=1}^{\infty} n^{r-2-1/p}\, E\left\{\max_{1\le k\le n} |S_k| - \epsilon n^{1/p}\right\}^{+} < \infty,$$
where $v = 2$ when $rp < 2$ and $v = rp$ when $rp \ge 2$.

Inspired by Chen and Li [14], Chen et al. [7], Sung [13], and the other papers above, we go on to investigate the limiting behavior of moving average processes based on martingale difference sequences and obtain results similar to Theorems C and D, but under some simpler conditions. Our results extend some results of Chen and Li [14] (see Remark 3.3 in Section 3). Two lemmas and two theorems are given in Sections 2 and 3, respectively. The proofs of the theorems are presented in Section 4.

For various results on martingales, one can refer to Chow [15], Hall and Heyde [16], Yu [17], Ghosal and Chandra [18], and so forth. For applications of moving average processes based on martingale differences, one can refer to [19–22] and the references therein. Throughout the paper, $I(A)$ denotes the indicator function of the set $A$, and $C$, $C_1$, $C_2$ denote positive constants not depending on $n$, which may be different in various places.

2. Two Lemmas

The following lemmas are the basic tools used to prove our main results.

Lemma 2.1 (cf. Hall and Heyde [16, Theorem 2.11]). If $\{X_i, \mathcal{F}_i, 1 \le i \le n\}$ is a martingale difference and $q > 0$, then there exists a constant $C$ depending only on $q$ such that
$$E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k} X_i\right|^{q}\right) \le C\left\{E\left(\sum_{i=1}^{n} E\left(X_i^{2} \mid \mathcal{F}_{i-1}\right)\right)^{q/2} + E\left(\max_{1\le i\le n} |X_i|^{q}\right)\right\}.$$
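For orientation (our own remark, not part of [16]): when $q = 2$, taking expectations and using $E\left[E\left(X_i^{2} \mid \mathcal{F}_{i-1}\right)\right] = EX_i^{2}$ reduces Lemma 2.1 to the familiar bound
$$E\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k} X_i\right|^{2}\right) \le C\sum_{i=1}^{n} EX_i^{2}.$$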

Lemma 2.2 (cf. Wu [23, Lemma 4.1.6]). Let $\{X_n, n \ge 1\}$ be a sequence of random variables which is stochastically dominated by a nonnegative random variable $X$. Then for any $\alpha > 0$ and $b > 0$, the following two statements hold:
$$E|X_n|^{\alpha}\, I\left(|X_n| \le b\right) \le C_1\left\{EX^{\alpha}\, I\left(X \le b\right) + b^{\alpha}\, P\left(X > b\right)\right\},$$
$$E|X_n|^{\alpha}\, I\left(|X_n| > b\right) \le C_2\, EX^{\alpha}\, I\left(X > b\right),$$
where $C_1$ and $C_2$ are positive constants.
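A frequently used consequence of Lemma 2.2 (a one-line check, recorded here for convenience): adding the two bounds and using $b^{\alpha} P(X > b) \le EX^{\alpha} I(X > b)$, which follows from Markov's inequality, gives
$$E|X_n|^{\alpha} \le C_1\, EX^{\alpha} I\left(X \le b\right) + \left(C_1 + C_2\right) EX^{\alpha} I\left(X > b\right) \le C\, EX^{\alpha}.$$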

3. Main Results

Theorem 3.1. Let $1 \le p < 2$ and $r > 1$. Assume that $\{X_n, n \ge 1\}$ is a moving average process defined in (1.1), where $\{Y_i, -\infty < i < \infty\}$ is a martingale difference related to an increasing sequence of $\sigma$-fields $\{\mathcal{F}_i, -\infty < i < \infty\}$ and stochastically dominated by a nonnegative random variable $Y$. Let $C$ be a positive constant. Suppose that $EY^{rp} < \infty$, and that $E\left(Y_i^{2} \mid \mathcal{F}_{i-1}\right) \le C$ almost surely (a.s.) if $rp \ge 2$. Then for every $\epsilon > 0$,
$$\sum_{n=1}^{\infty} n^{r-2}\, P\left(\max_{1\le k\le n} |S_k| \ge \epsilon n^{1/p}\right) < \infty, \tag{3.1}$$
$$\sum_{n=1}^{\infty} n^{r-2}\, P\left(\sup_{k\ge n} \frac{|S_k|}{k^{1/p}} \ge \epsilon\right) < \infty. \tag{3.2}$$
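For orientation (an illustrative special case of the statement above), taking $p = 1$ and $r = 2$, so that $rp = 2$ and both the moment and the conditional-variance assumptions are in force, (3.1) specializes to a Hsu-Robbins-type series:
$$\sum_{n=1}^{\infty} P\left(\max_{1\le k\le n} |S_k| \ge \epsilon n\right) < \infty, \qquad \forall\, \epsilon > 0.$$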

Theorem 3.2. Let the conditions of Theorem 3.1 hold. Then for every $\epsilon > 0$,
$$\sum_{n=1}^{\infty} n^{r-2-1/p}\, E\left\{\max_{1\le k\le n} |S_k| - \epsilon n^{1/p}\right\}^{+} < \infty, \tag{3.3}$$
$$\sum_{n=1}^{\infty} n^{r-2}\, E\left\{\sup_{k\ge n} \frac{|S_k|}{k^{1/p}} - \epsilon\right\}^{+} < \infty. \tag{3.4}$$

Remark 3.3. Let $\{\mathcal{F}_n, n \ge 1\}$ be an increasing family of $\sigma$-algebras and let $\{Y_n, n \ge 1\}$ be a sequence of martingale differences. Under the assumption that $E\left(|Y_n|^{2q} \mid \mathcal{F}_{n-1}\right) \le C$ a.s. for some $q \ge 1$, where $C$ is a constant not depending on $n$, and some other conditions, Yu [17] investigated the complete convergence of weighted sums of martingale differences. On the other hand, under the condition $\sup_{n} E\left(Y_n^{2} \mid \mathcal{F}_{n-1}\right) < \infty$ a.s. and other conditions, Ghosal and Chandra [18] obtained the complete convergence of martingale arrays. Thus, if $rp \ge 2$, our assumption $E\left(Y_i^{2} \mid \mathcal{F}_{i-1}\right) \le C$, a.s., is reasonable. Chen and Li [14] obtained Theorems C and D under the conditional second-moment bound $E\left(Y_i^{2} \mid \mathcal{F}_{i-1}\right) \le C$ a.s. We go on to investigate this moving average process under the moment condition $EY^{rp} < \infty$, especially for the case $1 < rp < 2$, and get the results (3.1)–(3.4). If $E\left(Y_i^{2} \mid \mathcal{F}_{i-1}\right) \le C$ a.s. for $-\infty < i < \infty$ and $EY^{rp} < \infty$, result (3.1) follows from Theorem C (see Theorem 1.1 of Chen and Li [14]), but we can obtain results (3.1) and (3.2) under the weaker condition $EY^{rp} < \infty$ alone. On the other hand, compared with the conditions of Theorem D, the conditions of our Theorem 3.2 are relatively simple.

4. The Proofs of Main Results

Proof of Theorem 3.1. First, we show that the moving average process (1.1) converges a.s. under the conditions of Theorem 3.1. Since $rp > 1$, we have $EY < \infty$, following from $EY^{rp} < \infty$. On the other hand, applying Lemma 2.2 with $\alpha = 1$ and $b = 1$, one has
$$E|Y_i| \le C_1\left\{EYI\left(Y \le 1\right) + P\left(Y > 1\right)\right\} + C_2\, EYI\left(Y > 1\right) \le C\left(1 + EY\right) < \infty. \tag{4.1}$$
Consequently, we have by $\sum_{i=-\infty}^{\infty} |a_i| < \infty$ that
$$E\sum_{i=-\infty}^{\infty} |a_i|\, |Y_{i+n}| = \sum_{i=-\infty}^{\infty} |a_i|\, E|Y_{i+n}| \le C\left(1 + EY\right) \sum_{i=-\infty}^{\infty} |a_i| < \infty, \tag{4.2}$$
which implies that $\sum_{i=-\infty}^{\infty} a_i Y_{i+n}$ converges a.s.
Note that
$$\max_{1\le k\le n} |S_k| = \max_{1\le k\le n}\left|\sum_{j=1}^{k} \sum_{i=-\infty}^{\infty} a_i Y_{i+j}\right| \le \sum_{i=-\infty}^{\infty} |a_i| \max_{1\le k\le n}\left|\sum_{j=1}^{k} Y_{i+j}\right|. \tag{4.3}$$
Let
$$Y_{nj} = Y_j\, I\left(|Y_j| \le n^{1/p}\right), \qquad Z_{nj} = Y_{nj} - E\left(Y_{nj} \mid \mathcal{F}_{j-1}\right), \qquad -\infty < j < \infty,\ n \ge 1. \tag{4.4}$$
Since $E\left(Y_j \mid \mathcal{F}_{j-1}\right) = 0$, we can see that
$$\begin{aligned} P\left(\max_{1\le k\le n} |S_k| \ge \epsilon n^{1/p}\right) \le{}& P\left(\sum_{i=-\infty}^{\infty} |a_i| \max_{1\le k\le n}\left|\sum_{j=1}^{k} Z_{n,i+j}\right| \ge \frac{\epsilon n^{1/p}}{3}\right) \\ &+ P\left(\sum_{i=-\infty}^{\infty} |a_i| \sum_{j=1}^{n} |Y_{i+j}|\, I\left(|Y_{i+j}| > n^{1/p}\right) \ge \frac{\epsilon n^{1/p}}{3}\right) \\ &+ P\left(\sum_{i=-\infty}^{\infty} |a_i| \sum_{j=1}^{n} E\left(|Y_{i+j}|\, I\left(|Y_{i+j}| > n^{1/p}\right) \mid \mathcal{F}_{i+j-1}\right) \ge \frac{\epsilon n^{1/p}}{3}\right) \\ =:{}& I_1 + I_2 + I_3. \end{aligned} \tag{4.5}$$
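The decomposition (4.5) rests on the following elementary identity (recorded here for readability): since $E\left(Y_j \mid \mathcal{F}_{j-1}\right) = 0$,
$$E\left(Y_{nj} \mid \mathcal{F}_{j-1}\right) = -E\left(Y_j\, I\left(|Y_j| > n^{1/p}\right) \mid \mathcal{F}_{j-1}\right),$$
so the centering term produced by $Z_{nj}$ is exactly the conditional tail expectation appearing in $I_3$.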
For $I_2$, by Markov's inequality, Lemma 2.2, and $EY^{rp} < \infty$, one has
$$\sum_{n=1}^{\infty} n^{r-2} I_2 \le C\sum_{n=1}^{\infty} n^{r-2} \cdot \frac{1}{n^{1/p}} \sum_{i=-\infty}^{\infty} |a_i| \sum_{j=1}^{n} E|Y_{i+j}|\, I\left(|Y_{i+j}| > n^{1/p}\right) \le C\sum_{n=1}^{\infty} n^{r-1-1/p}\, EYI\left(Y > n^{1/p}\right) \le C\, EY^{rp} < \infty. \tag{4.6}$$
Meanwhile, by the martingale property, Lemma 2.2, and the proof of (4.6), it follows that
$$\sum_{n=1}^{\infty} n^{r-2} I_3 \le C\sum_{n=1}^{\infty} n^{r-2} \cdot \frac{1}{n^{1/p}} \sum_{i=-\infty}^{\infty} |a_i| \sum_{j=1}^{n} E\left[E\left(|Y_{i+j}|\, I\left(|Y_{i+j}| > n^{1/p}\right) \mid \mathcal{F}_{i+j-1}\right)\right] \le C\sum_{n=1}^{\infty} n^{r-1-1/p}\, EYI\left(Y > n^{1/p}\right) < \infty. \tag{4.7}$$
Obviously, one can find that $\{Z_{nj}, \mathcal{F}_j, -\infty < j < \infty\}$ is a martingale difference. So, by Markov's inequality, Hölder's inequality, and Lemma 2.1, we get that for any $q \ge 2$,
$$I_1 \le \frac{C}{n^{q/p}}\, E\left(\sum_{i=-\infty}^{\infty} |a_i| \max_{1\le k\le n}\left|\sum_{j=1}^{k} Z_{n,i+j}\right|\right)^{q} \le \frac{C}{n^{q/p}}\left(\sum_{i=-\infty}^{\infty} |a_i|\right)^{q} \sup_{i}\left\{E\left(\sum_{j=1}^{n} E\left(Z_{n,i+j}^{2} \mid \mathcal{F}_{i+j-1}\right)\right)^{q/2} + \sum_{j=1}^{n} E\left|Z_{n,i+j}\right|^{q}\right\}. \tag{4.8}$$
If $rp \ge 2$, then we take $q$ large enough such that $q > rp$ and $r - 2 + q\left(\frac{1}{2} - \frac{1}{p}\right) < -1$. From $E\left(Y_i^{2} \mid \mathcal{F}_{i-1}\right) \le C$, a.s., and Jensen's inequality for conditional expectation, we have $E\left(Z_{ni}^{2} \mid \mathcal{F}_{i-1}\right) \le E\left(Y_{ni}^{2} \mid \mathcal{F}_{i-1}\right) \le C$, a.s. On the other hand,
$$E\left(\sum_{j=1}^{n} E\left(Z_{n,i+j}^{2} \mid \mathcal{F}_{i+j-1}\right)\right)^{q/2} \le \left(Cn\right)^{q/2}. \tag{4.9}$$
Consequently, we obtain by (4.9) that
$$\sum_{n=1}^{\infty} n^{r-2} \cdot \frac{C}{n^{q/p}}\left(Cn\right)^{q/2} = C\sum_{n=1}^{\infty} n^{r-2+q(1/2-1/p)} < \infty, \tag{4.10}$$
following from the fact that $r - 2 + q\left(\frac{1}{2} - \frac{1}{p}\right) < -1$. Meanwhile, by the $C_r$ inequality, Lemma 2.2, and Jensen's inequality,
$$\sum_{j=1}^{n} E\left|Z_{n,i+j}\right|^{q} \le C\sum_{j=1}^{n} E\left|Y_{n,i+j}\right|^{q} \le Cn\left\{EY^{q} I\left(Y \le n^{1/p}\right) + n^{q/p}\, P\left(Y > n^{1/p}\right)\right\}. \tag{4.11}$$
Since $EY^{rp} < \infty$ and $q > rp$, one has
$$\sum_{n=1}^{\infty} n^{r-2} \cdot \frac{n}{n^{q/p}}\, EY^{q} I\left(Y \le n^{1/p}\right) \le C\, EY^{rp} < \infty. \tag{4.12}$$
By the proof of (4.6),
$$\sum_{n=1}^{\infty} n^{r-2} \cdot \frac{n}{n^{q/p}} \cdot n^{q/p}\, P\left(Y > n^{1/p}\right) = \sum_{n=1}^{\infty} n^{r-1}\, P\left(Y > n^{1/p}\right) \le C\, EY^{rp} < \infty. \tag{4.13}$$
If $rp < 2$, then we take $q = 2$. Similar to the proofs of (4.8) and (4.11), we have
$$\sum_{n=1}^{\infty} n^{r-2} I_1 \le C\sum_{n=1}^{\infty} n^{r-1-2/p}\, EY^{2} I\left(Y \le n^{1/p}\right) + C\sum_{n=1}^{\infty} n^{r-1}\, P\left(Y > n^{1/p}\right) < \infty,$$
following from $rp < 2$, (4.12), and (4.13). Therefore, (3.1) follows from (4.5)–(4.13) and the inequality above.
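The last inequality in (4.6) is the usual interchange-of-summation computation; a sketch in our notation, valid since $rp > 1$:
$$\sum_{n=1}^{\infty} n^{r-1-1/p}\, EYI\left(Y > n^{1/p}\right) = \sum_{n=1}^{\infty} n^{r-1-1/p} \sum_{m=n}^{\infty} EYI\left(m < Y^{p} \le m+1\right) \le C\sum_{m=1}^{\infty} m^{r-1/p}\, EYI\left(m < Y^{p} \le m+1\right) \le C\, EY^{rp},$$
because $m^{r-1/p} \le C\, Y^{rp-1}$ on the event $\{m < Y^{p} \le m+1\}$.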
Inspired by the proof of Theorem 12.1 of Gut [24], it can be checked that
$$\sum_{n=1}^{\infty} n^{r-2}\, P\left(\sup_{k\ge n} \frac{|S_k|}{k^{1/p}} \ge \epsilon\right) = \sum_{j=0}^{\infty} \sum_{n=2^{j}}^{2^{j+1}-1} n^{r-2}\, P\left(\sup_{k\ge n} \frac{|S_k|}{k^{1/p}} \ge \epsilon\right) \le \sum_{j=0}^{\infty} P\left(\sup_{k\ge 2^{j}} \frac{|S_k|}{k^{1/p}} \ge \epsilon\right) \sum_{n=2^{j}}^{2^{j+1}-1} n^{r-2}. \tag{4.14}$$
If $r \ge 2$, then
$$\sum_{n=2^{j}}^{2^{j+1}-1} n^{r-2} \le 2^{j} \cdot 2^{(j+1)(r-2)} \le C\, 2^{j(r-1)}. \tag{4.15}$$
Otherwise,
$$\sum_{n=2^{j}}^{2^{j+1}-1} n^{r-2} \le 2^{j} \cdot 2^{j(r-2)} = 2^{j(r-1)}. \tag{4.16}$$
Combining (3.1) with these inequalities above, we obtain (3.2) immediately.
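The step connecting these dyadic bounds to (3.1) is the standard maximal-block estimate; a sketch in our notation:
$$P\left(\sup_{k\ge 2^{j}} \frac{|S_k|}{k^{1/p}} \ge \epsilon\right) \le \sum_{l=j}^{\infty} P\left(\max_{2^{l}\le k < 2^{l+1}} |S_k| \ge \epsilon\, 2^{l/p}\right) \le \sum_{l=j}^{\infty} P\left(\max_{1\le k\le 2^{l+1}} |S_k| \ge \epsilon\, 2^{l/p}\right),$$
and interchanging the order of summation in $\sum_{j} 2^{j(r-1)} \sum_{l\ge j} (\cdot)$ leaves a series that is finite by (3.1).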

Proof of Theorem 3.2. For all $\epsilon > 0$, we have
$$E\left\{\max_{1\le k\le n} |S_k| - \epsilon n^{1/p}\right\}^{+} = \int_{0}^{\infty} P\left(\max_{1\le k\le n} |S_k| \ge \epsilon n^{1/p} + t\right) dt \tag{4.17}$$
$$\le n^{1/p}\, P\left(\max_{1\le k\le n} |S_k| \ge \epsilon n^{1/p}\right) + \int_{n^{1/p}}^{\infty} P\left(\max_{1\le k\le n} |S_k| \ge t\right) dt. \tag{4.18}$$
By Theorem 3.1, in order to prove (3.3), we only have to show that
$$\sum_{n=1}^{\infty} n^{r-2-1/p} \int_{n^{1/p}}^{\infty} P\left(\max_{1\le k\le n} |S_k| \ge t\right) dt < \infty, \tag{4.19}$$
since $\sum_{n=1}^{\infty} n^{r-2-1/p} \cdot n^{1/p}\, P\left(\max_{1\le k\le n} |S_k| \ge \epsilon n^{1/p}\right) < \infty$ by (3.1). For $t \ge n^{1/p}$, denote
$$Y_{tj} = Y_j\, I\left(|Y_j| \le t\right), \qquad Z_{tj} = Y_{tj} - E\left(Y_{tj} \mid \mathcal{F}_{j-1}\right), \qquad -\infty < j < \infty. \tag{4.20}$$
Since $E\left(Y_j \mid \mathcal{F}_{j-1}\right) = 0$, it is easy to see that
$$P\left(\max_{1\le k\le n} |S_k| \ge t\right) \le P\left(\sum_{i=-\infty}^{\infty} |a_i| \max_{1\le k\le n}\left|\sum_{j=1}^{k} Z_{t,i+j}\right| \ge \frac{t}{3}\right) + P\left(\sum_{i=-\infty}^{\infty} |a_i| \sum_{j=1}^{n} |Y_{i+j}|\, I\left(|Y_{i+j}| > t\right) \ge \frac{t}{3}\right) + P\left(\sum_{i=-\infty}^{\infty} |a_i| \sum_{j=1}^{n} E\left(|Y_{i+j}|\, I\left(|Y_{i+j}| > t\right) \mid \mathcal{F}_{i+j-1}\right) \ge \frac{t}{3}\right) =: J_1 + J_2 + J_3. \tag{4.21}$$
By Markov's inequality, Lemma 2.2, and the martingale property,
$$\sum_{n=1}^{\infty} n^{r-2-1/p} \int_{n^{1/p}}^{\infty} \left(J_2 + J_3\right) dt \le C\sum_{n=1}^{\infty} n^{r-1-1/p} \int_{n^{1/p}}^{\infty} t^{-1}\, EYI\left(Y > t\right) dt \le C\sum_{m=1}^{\infty} m^{r-1-1/p}\, EYI\left(Y > m^{1/p}\right) < \infty. \tag{4.22}$$
Since $\{Z_{tj}, \mathcal{F}_j, -\infty < j < \infty\}$ is a martingale difference, we have by Markov's inequality, Hölder's inequality, and Lemma 2.1 that for any $q \ge 2$,
$$J_1 \le \frac{C}{t^{q}}\left(\sum_{i=-\infty}^{\infty} |a_i|\right)^{q} \sup_{i}\left\{E\left(\sum_{j=1}^{n} E\left(Z_{t,i+j}^{2} \mid \mathcal{F}_{i+j-1}\right)\right)^{q/2} + \sum_{j=1}^{n} E\left|Z_{t,i+j}\right|^{q}\right\}. \tag{4.23}$$
If $rp \ge 2$, then we take $q$ large enough such that $q > rp$ and $r - 2 + q\left(\frac{1}{2} - \frac{1}{p}\right) < -1$. By $E\left(Y_i^{2} \mid \mathcal{F}_{i-1}\right) \le C$, a.s., and Jensen's inequality for conditional expectation, we have $E\left(Z_{ti}^{2} \mid \mathcal{F}_{i-1}\right) \le E\left(Y_{ti}^{2} \mid \mathcal{F}_{i-1}\right) \le C$, a.s. Meanwhile,
$$E\left(\sum_{j=1}^{n} E\left(Z_{t,i+j}^{2} \mid \mathcal{F}_{i+j-1}\right)\right)^{q/2} \le \left(Cn\right)^{q/2}. \tag{4.24}$$
Thus, by (4.24), one has that
$$\sum_{n=1}^{\infty} n^{r-2-1/p} \int_{n^{1/p}}^{\infty} \frac{\left(Cn\right)^{q/2}}{t^{q}}\, dt \le C\sum_{n=1}^{\infty} n^{r-2+q(1/2-1/p)} < \infty, \tag{4.25}$$
following from the fact that $r - 2 + q\left(\frac{1}{2} - \frac{1}{p}\right) < -1$. We also have by the $C_r$ inequality, Lemma 2.2, and Jensen's inequality that
$$\sum_{j=1}^{n} E\left|Z_{t,i+j}\right|^{q} \le Cn\left\{EY^{q} I\left(Y \le t\right) + t^{q}\, P\left(Y > t\right)\right\}. \tag{4.26}$$
Since $EY^{rp} < \infty$ and $q > rp$, it follows that
$$\sum_{n=1}^{\infty} n^{r-1-1/p} \int_{n^{1/p}}^{\infty} t^{-q}\, EY^{q} I\left(Y \le t\right) dt < \infty. \tag{4.27}$$
From the proof of (4.22),
$$\sum_{n=1}^{\infty} n^{r-1-1/p} \int_{n^{1/p}}^{\infty} P\left(Y > t\right) dt \le \sum_{n=1}^{\infty} n^{r-1-1/p} \int_{n^{1/p}}^{\infty} t^{-1}\, EYI\left(Y > t\right) dt < \infty. \tag{4.28}$$
If $rp < 2$, then we take $q = 2$. Similar to the proofs of (4.23) and (4.26), we get that
$$\sum_{n=1}^{\infty} n^{r-2-1/p} \int_{n^{1/p}}^{\infty} J_1\, dt \le C\sum_{n=1}^{\infty} n^{r-1-1/p} \int_{n^{1/p}}^{\infty} t^{-2}\left\{EY^{2} I\left(Y \le t\right) + t^{2}\, P\left(Y > t\right)\right\} dt < \infty,$$
following from $rp < 2$, (4.27), and (4.28). Consequently, by (4.18)–(4.28), Theorem 3.1, and the inequality above, (3.3) holds true.
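For completeness, the integrals over $[n^{1/p}, \infty)$ above are typically evaluated by the slab decomposition $t \in [m^{1/p}, (m+1)^{1/p})$, $m \ge n$ (a sketch in our notation, using $(m+1)^{1/p} - m^{1/p} \le C\, m^{1/p-1}$ for $p \ge 1$):
$$\int_{n^{1/p}}^{\infty} P\left(\max_{1\le k\le n} |S_k| \ge t\right) dt \le C\sum_{m=n}^{\infty} m^{1/p-1}\, P\left(\max_{1\le k\le n} |S_k| \ge m^{1/p}\right).$$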
Now, we turn to prove (3.4). Similar to the proof of (3.2), we have that
$$\sum_{n=1}^{\infty} n^{r-2}\, E\left\{\sup_{k\ge n} \frac{|S_k|}{k^{1/p}} - \epsilon\right\}^{+} \le \sum_{j=0}^{\infty} E\left\{\sup_{k\ge 2^{j}} \frac{|S_k|}{k^{1/p}} - \epsilon\right\}^{+} \sum_{n=2^{j}}^{2^{j+1}-1} n^{r-2}.$$
If $r \ge 2$, then
$$\sum_{n=2^{j}}^{2^{j+1}-1} n^{r-2} \le 2^{j} \cdot 2^{(j+1)(r-2)} \le C\, 2^{j(r-1)}.$$
Otherwise,
$$\sum_{n=2^{j}}^{2^{j+1}-1} n^{r-2} \le 2^{j} \cdot 2^{j(r-2)} = 2^{j(r-1)}.$$
Therefore, (3.4) holds true, following from (3.3).

Acknowledgments

The authors are grateful to Editor Chuanxi Qian and an anonymous referee for their careful reading and insightful comments. This work is supported by the National Natural Science Foundation of China (11171001, 11126176), HSSPF of the Ministry of Education of China (10YJA910005), Natural Science Foundation of Anhui Province (1208085QA03), and Provincial Natural Science Research Project of Anhui Colleges (KJ2010A005).