In this section, we extend the corresponding results in Nakata [5] from the case of i.i.d. random variables to pairwise NQD random variables.
2.1 Main results
We state our weak laws of large numbers for various weighted sums of pairwise NQD random variables.
Theorem 2.1
Let \(\{X_{i}, i\ge 1\}\) be a sequence of pairwise NQD random variables whose distributions satisfy
$$ \mathbb {P}\bigl(\vert X_{j} \vert >x\bigr)\asymp x^{-\alpha } \quad \textit{for } j\ge 1 $$
and
$$ \limsup_{x\to \infty } \sup_{j\ge 1}x^{\alpha }\mathbb {P}\bigl(\vert X_{j} \vert >x\bigr)< \infty . $$
If there exist two positive sequences \(\{a_{j}\}\) and \(\{b_{j}\}\) satisfying
$$ \sum_{j=1}^{n} a_{j}^{\alpha }=o\bigl(b_{n}^{\alpha }\bigr), $$
(2.1)
then it follows that
$$ \lim_{n\to \infty }\frac{1}{b_{n}}\sum _{j=1}^{n} a_{j} \biggl( X_{j}- \mathbb {E}X _{j}1 \biggl\{ \vert X_{j} \vert \le \frac{b_{n}}{a_{j}} \biggr\} \biggr) =0 \quad \textit{in probability}. $$
(2.2)
In particular, if there exists a constant A such that
$$ \lim_{n\to \infty }\frac{1}{b_{n}}\sum_{j=1}^{n} a_{j}\mathbb {E}X_{j}1 \biggl\{ \vert X_{j} \vert \le \frac{b_{n}}{a_{j}} \biggr\} =A, $$
then we have
$$ \lim_{n\to \infty }\frac{1}{b_{n}}\sum_{j=1}^{n} a_{j} X_{j}=A \quad \textit{in probability}. $$
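For orientation, here is a simple choice of weights satisfying (2.1) (an illustrative example of ours, not taken from [5]): take \(a_{j}\equiv 1\) and \(b_{n}=(n\log n)^{1/\alpha }\), so that
$$ \sum_{j=1}^{n} a_{j}^{\alpha }=n=o(n\log n)=o\bigl(b_{n}^{\alpha }\bigr), $$
and (2.2) holds with this normalization.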
From Theorem 2.1, calculating the constant A by the same methods as in Nakata [5], we obtain the following four corollaries for pairwise NQD random variables.
Corollary 2.1
Under the assumptions of Theorem 2.1, if \(0<\alpha <1\), then we have
$$ \lim_{n\to \infty }\frac{1}{b_{n}}\sum_{j=1}^{n} a_{j} X_{j}=0 \quad \textit{in probability}. $$
Corollary 2.2
Let \(\{X_{i}, i\ge 1\}\) be a sequence of nonnegative pairwise NQD random variables whose distributions satisfy
$$ \mathbb {P}\bigl(\vert X_{j} \vert >x\bigr)= (x+q_{j})^{-1} \quad \textit{for } x\ge 0. $$
If \(q_{j}\ge 1\) for every positive integer j and
$$ Q_{n}:=\sum_{j=1}^{n}q_{j}^{-1} \to \infty \quad \textit{as } n\to \infty , $$
(2.3)
then we have
$$ \lim_{n\to \infty }\frac{\sum_{j=1}^{n} q_{j}^{-1}X_{j}}{Q_{n}\log Q _{n}}=1 \quad \textit{in probability}. $$
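To see heuristically where the normalizer \(Q_{n}\log Q_{n}\) comes from (a sketch of the computation of the constant A, in the spirit of Nakata [5]): since \(\mathbb {P}(X_{j}>x)=(x+q_{j})^{-1}\) for \(x\ge 0\),
$$ \mathbb {E}X_{j}1\{X_{j}\le x\}=\int_{0}^{x}\mathbb {P}(X_{j}>t)\,dt-x\mathbb {P}(X_{j}>x)=\log \frac{x+q_{j}}{q_{j}}-\frac{x}{x+q_{j}}, $$
so with \(a_{j}=q_{j}^{-1}\), \(b_{n}=Q_{n}\log Q_{n}\) and truncation level \(b_{n}/a_{j}=q_{j}Q_{n}\log Q_{n}\),
$$ \frac{1}{b_{n}}\sum_{j=1}^{n} q_{j}^{-1}\mathbb {E}X_{j}1\biggl\{ X_{j}\le \frac{b_{n}}{a_{j}}\biggr\} =\frac{\log (1+Q_{n}\log Q_{n})+O(1)}{\log Q_{n}}\to 1, $$
which identifies \(A=1\).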
Corollary 2.3
Suppose that the assumptions of Corollary 2.2 hold with \(q_{j}=j\) in Eq. (2.3). Then, for any \(\gamma >-1\) and real δ, we have
$$ \lim_{n\to \infty }\frac{\sum_{j=1}^{n} j^{-1}(\log j)^{\gamma }( \log \log j)^{\delta }X_{j}}{(\log n)^{\gamma +1}(\log \log n)^{ \delta +1}}=\frac{1}{\gamma +1} \quad \textit{in probability}. $$
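The normalizer here reflects a routine integral comparison (recorded for the reader's convenience): for \(\gamma >-1\),
$$ \sum_{j=3}^{n} j^{-1}(\log j)^{\gamma }(\log \log j)^{\delta }\sim \int_{e}^{\log n} u^{\gamma }(\log u)^{\delta }\,du\sim \frac{(\log n)^{\gamma +1}(\log \log n)^{\delta }}{\gamma +1}, $$
while the extra factor \(\log \log n\) in the denominator comes from the truncated means, which grow like \(\log Q_{n}\sim \log \log n\) when \(q_{j}=j\).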
Corollary 2.4
Let \(\{X_{i}, i\ge 1\}\) be a sequence of pairwise NQD random variables whose common distribution satisfies (1.4) with \(\alpha =1\). If there exists a real Q such that
$$ \lim_{x\to \infty }\frac{\mathbb {E}X1(\vert X \vert \le x)}{\log x}=Q, $$
then, for each real \(\beta >-1\) and slowly varying sequence \(l(n)\), it follows that
$$ \lim_{n\to \infty }\frac{\sum_{j=1}^{n} j^{\beta }l(j)X_{j}}{n^{ \beta +1}l(n)\log n}=\frac{Q}{1+\beta } \quad \textit{in probability}. $$
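As a concrete illustration (our example, not from the original): if X is nonnegative with \(\mathbb {P}(X>x)=x^{-1}\) for \(x\ge 1\), then
$$ \mathbb {E}X1(X\le x)=\int_{0}^{x}\mathbb {P}(X>t)\,dt-x\mathbb {P}(X>x)=(1+\log x)-1=\log x, $$
so \(Q=1\) and the limit in Corollary 2.4 equals \(1/(1+\beta )\).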
Theorem 2.2
Let \(\{X_{i}, i\ge 1\}\) be a sequence of pairwise NQD random variables whose distributions satisfy
$$ \mathbb {P}\bigl(\vert X_{j} \vert >x\bigr)\asymp x^{-\alpha } \quad \textit{for } j\ge 1 $$
and
$$ \limsup_{x\to \infty } \sup_{j\ge 1}x^{\alpha }\mathbb {P}\bigl(\vert X_{j} \vert >x\bigr)< \infty . $$
If there exist two positive sequences \(\{a_{j}\}\) and \(\{b_{j}\}\) satisfying
$$ \bigl( \log^{2} n \bigr) \sum_{j=1}^{n} a_{j}^{\alpha }=o\bigl(b_{n}^{\alpha }\bigr), $$
(2.4)
then it follows that
$$ \lim_{n\to \infty }\frac{1}{b_{n}}\max _{1\le k\le n}\Biggl\vert \sum_{j=1} ^{k} a_{j} \biggl( X_{j}-\mathbb {E}X_{j}1 \biggl\{ \vert X_{j} \vert \le \frac{b_{n}}{a_{j}} \biggr\} \biggr) \Biggr\vert =0 \quad \textit{in probability}. $$
(2.5)
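Again for orientation (an illustrative choice of weights, not from the original): condition (2.4) holds for \(a_{j}\equiv 1\) and \(b_{n}=(n\log^{3} n)^{1/\alpha }\), since then
$$ \bigl( \log^{2} n \bigr) \sum_{j=1}^{n} a_{j}^{\alpha }=n\log^{2} n=o\bigl(n\log^{3} n\bigr)=o\bigl(b_{n}^{\alpha }\bigr). $$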
Corollary 2.5
Under the assumptions of Theorem 2.2, if \(0<\alpha <1\), then we have
$$ \lim_{n\to \infty }\frac{1}{b_{n}}\max_{1\le k\le n}\Biggl\vert \sum_{j=1} ^{k} a_{j} X_{j}\Biggr\vert =0 \quad \textit{in probability}. $$
Remark 2.1
If \(\{X_{i}, i\ge 1\}\) is a sequence of NA random variables satisfying the assumptions of Theorem 2.2, then by the maximal inequality for NA random variables (see [8, Theorem 2]), condition (2.4) can be weakened to (2.1).
2.2 Proofs of Theorem 2.1 and Theorem 2.2
We first give some useful lemmas.
Lemma 2.1
([5])
If a random variable X satisfies (1.4), then it follows that
$$ \mathbb {E}\bigl(\vert X \vert 1\bigl(\vert X \vert \le x\bigr)\bigr)\asymp \textstyle\begin{cases} x^{1-\alpha },& \textit{if } 0< \alpha < 1, \\ \log x,& \textit{if } \alpha =1, \end{cases} $$
and
$$ \mathbb {E}\bigl(\vert X \vert ^{2}1\bigl(\vert X \vert \le x\bigr) \bigr)\asymp x^{2-\alpha } \quad \textit{for } 0< \alpha \le 1. $$
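In the exact Pareto case \(\mathbb {P}(\vert X \vert >x)=x^{-\alpha }\) for \(x\ge 1\), the first relation can be verified directly (a sanity check of ours, not part of [5]):
$$ \mathbb {E}\bigl(\vert X \vert 1\bigl(\vert X \vert \le x\bigr)\bigr)=\int_{0}^{x}\mathbb {P}\bigl(\vert X \vert >t\bigr)\,dt-x\mathbb {P}\bigl(\vert X \vert >x\bigr)= \textstyle\begin{cases} \frac{\alpha }{1-\alpha }(x^{1-\alpha }-1),& \textit{if } 0< \alpha < 1, \\ \log x,& \textit{if } \alpha =1. \end{cases} $$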
Lemma 2.2
([6])
Let \(\{X_{n},n\ge 1\}\) be a sequence of pairwise NQD random variables, and let \(\{f_{n}, n\ge 1\}\) be a sequence of increasing functions. Then \(\{f_{n}(X_{n}), n\ge 1\}\) is a sequence of pairwise NQD random variables.
Lemma 2.3
([9])
Let \(\{X_{n},n\ge 1\}\) be a sequence of pairwise NQD random variables with mean zero and \(\mathbb {E}X_{n}^{2}<\infty \), and let \(T_{j}(k)= \sum_{i=j+1}^{j+k} X_{i}\), \(j\ge 0\). Then
$$ \mathbb {E}\bigl(T_{j}(k)\bigr)^{2}\le C\sum _{i=j+1}^{j+k} \mathbb {E}X_{i}^{2}, \qquad \mathbb {E}\max_{1\le k\le n} \bigl(T_{j}(k)\bigr)^{2}\le C\log^{2} n\sum_{i=j+1}^{j+n} \mathbb {E}X _{i}^{2}. $$
Proof of Theorem 2.1
For any \(1\le i\le n\), let us define
$$ Y_{ni}=-b_{n}1(a_{i}X_{i}< -b_{n})+a_{i}X_{i}1 \bigl(a_{i}\vert X_{i} \vert \le b_{n}\bigr)+b _{n}1(a_{i}X_{i}>b_{n}) $$
and
$$ Z_{ni}=(a_{i}X_{i}+b_{n})1(a_{i}X_{i}< -b_{n})+(a_{i}X_{i}-b_{n})1(a _{i}X_{i}>b_{n}). $$
Then from Lemma 2.2 it follows that \(\{Y_{ni}, 1\le i\le n, n \ge 1\}\) and \(\{Z_{ni}, 1\le i\le n, n\ge 1\}\) are both pairwise NQD, and
$$ \sum_{i=1}^{n} a_{i} X_{i}=\sum_{i=1}^{n} (Y_{ni}+Z_{ni}). $$
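Indeed, on each of the events \(\{a_{i}X_{i}<-b_{n}\}\), \(\{a_{i}\vert X_{i} \vert \le b_{n}\}\) and \(\{a_{i}X_{i}>b_{n}\}\) one checks directly that \(Y_{ni}+Z_{ni}=a_{i}X_{i}\); for instance, on \(\{a_{i}X_{i}>b_{n}\}\),
$$ Y_{ni}+Z_{ni}=b_{n}+(a_{i}X_{i}-b_{n})=a_{i}X_{i}. $$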
Furthermore, define \(X_{ni}= X_{i}1(a_{i}\vert X_{i} \vert \le b_{n})\). Then the limit (2.2) holds if we show
$$\begin{aligned}& \lim_{n\to \infty } \frac{1}{b_{n}}\sum _{i=1}^{n} (Y_{ni}-\mathbb {E}Y_{ni})=0 \quad \text{in probability}, \end{aligned}$$
(2.6)
$$\begin{aligned}& \lim_{n\to \infty } \frac{1}{b_{n}}\sum _{i=1}^{n} (a_{i}X_{i}- Y_{ni})=0 \quad \text{in probability}, \end{aligned}$$
(2.7)
and
$$ \lim_{n\to \infty } \frac{1}{b_{n}}\sum _{i=1}^{n} (a_{i}\mathbb {E}X_{ni}- \mathbb {E}Y _{ni})=0. $$
(2.8)
Using the proof of Lemma 2.2 in [4], we get
$$ \frac{1}{b^{2}_{n}}\sum_{i=1}^{n} a_{i}^{2}\mathbb {E}\bigl[ X_{i}^{2}1\bigl(a _{i}\vert X_{i} \vert \le b_{n}\bigr) \bigr] \to 0 $$
and
$$ \sum_{i=1}^{n} \mathbb {P}\bigl(a_{i} \vert X_{i} \vert \ge b_{n}\bigr)\to 0. $$
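For the reader's convenience, here is a direct sketch of both limits using Lemma 2.1 and (2.1). Since (2.1) forces \(\max_{1\le i\le n} a_{i}/b_{n}\to 0\), the truncation levels \(b_{n}/a_{i}\) tend to infinity, so
$$ \frac{1}{b^{2}_{n}}\sum_{i=1}^{n} a_{i}^{2}\mathbb {E}\bigl[ X_{i}^{2}1\bigl(a_{i}\vert X_{i} \vert \le b_{n}\bigr) \bigr] \le \frac{C}{b^{2}_{n}}\sum_{i=1}^{n} a_{i}^{2}(b_{n}/a_{i})^{2-\alpha }=\frac{C}{b_{n}^{\alpha }}\sum_{i=1}^{n} a_{i}^{\alpha }\to 0, $$
and the second limit follows similarly from the uniform tail bound, \(\sum_{i=1}^{n}\mathbb {P}(a_{i}\vert X_{i} \vert \ge b_{n})\le Cb_{n}^{-\alpha }\sum_{i=1}^{n} a_{i}^{\alpha }\to 0\).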
From Lemma 2.1 and Lemma 2.3, we have
$$\begin{aligned} \frac{1}{b^{2}_{n}}\operatorname{Var} \Biggl( \sum_{i=1}^{n}(Y_{ni}-\mathbb {E}Y_{ni}) \Biggr) &\le \frac{C}{b^{2}_{n}}\sum _{i=1}^{n}\mathbb {E}Y_{ni}^{2} \\ & \le \frac{C}{b ^{2}_{n}}\sum_{i=1}^{n} a_{i}^{2}\mathbb {E}\bigl[ X_{i}^{2}1 \bigl(a_{i}\vert X_{i} \vert \le b_{n}\bigr) \bigr] \\ &\quad {}+C\sum_{i=1}^{n} \mathbb {P}\bigl(a_{i}\vert X_{i} \vert \ge b_{n}\bigr) \to 0, \end{aligned}$$
which implies (2.6). Similarly, for any \(r>0\), we have
$$\begin{aligned} \mathbb {P}\Biggl( \frac{1}{b_{n}}\Biggl\vert \sum_{i=1}^{n} Z_{ni}\Biggr\vert >r \Biggr)&\le \mathbb {P}\Biggl( \bigcup_{i=1}^{n} \bigl\{ a_{i}\vert X_{i} \vert >b_{n}\bigr\} \Biggr) \\ &\le \sum_{i=1}^{n} \mathbb {P}\bigl( a_{i}\vert X_{i} \vert >b_{n} \bigr) \to 0, \end{aligned}$$
which yields (2.7). Finally, (2.8) holds since
$$ \frac{1}{b_{n}}\Biggl\vert \sum_{i=1}^{n} (a_{i}\mathbb {E}X_{ni}- \mathbb {E}Y_{ni})\Biggr\vert \le C \sum_{i=1}^{n}\mathbb {P}\bigl(a_{i} \vert X_{i} \vert >b_{n}\bigr)\to 0. $$
Based on the above discussions, the desired results are obtained. □
Proof of Theorem 2.2
By an argument similar to the proof of Theorem 2.1, it suffices to show
$$\begin{aligned}& \lim_{n\to \infty } \frac{1}{b_{n}}\max _{1\le k\le n}\Biggl\vert \sum_{i=1} ^{k} (Y_{ni}-\mathbb {E}Y_{ni})\Biggr\vert =0 \quad \text{in probability}, \end{aligned}$$
(2.9)
$$\begin{aligned}& \lim_{n\to \infty } \frac{1}{b_{n}}\max _{1\le k\le n}\Biggl\vert \sum_{i=1} ^{k} (a_{i}X_{i}- Y_{ni})\Biggr\vert =0 \quad \text{in probability}, \end{aligned}$$
(2.10)
and
$$ \lim_{n\to \infty } \frac{1}{b_{n}}\max _{1\le k\le n}\Biggl\vert \sum_{i=1} ^{k} (a_{i}\mathbb {E}X_{ni}- \mathbb {E}Y_{ni}) \Biggr\vert =0. $$
(2.11)
From Lemma 2.1 and Lemma 2.3, we have
$$\begin{aligned} &\frac{1}{b^{2}_{n}}\mathbb {E}\Biggl( \max_{1\le k\le n}\Biggl\vert \sum_{i=1}^{k} (Y _{ni}-\mathbb {E}Y_{ni})\Biggr\vert ^{2} \Biggr) \\ &\quad \le \frac{C\log^{2} n}{b^{2}_{n}} \sum_{i=1}^{n}\mathbb {E}Y_{ni}^{2} \\ &\quad \le \frac{C\log^{2} n}{b^{2}_{n}} \sum_{i=1}^{n} a_{i}^{2}\mathbb {E}\bigl[ X_{i}^{2}1 \bigl(a_{i}\vert X_{i} \vert \le b_{n}\bigr) \bigr] +C \log^{2} n\sum_{i=1}^{n} \mathbb {P}\bigl(a_{i}\vert X_{i} \vert \ge b_{n} \bigr) \\ &\quad \le \frac{C \log^{2} n}{b^{2}_{n}}\sum_{i=1}^{n} a_{i}^{2}(b_{n}/a_{i})^{2-\alpha }+ \frac{C\log^{2} n}{b_{n}^{\alpha }}\sum_{i=1}^{n} a_{i}^{\alpha } \to 0, \end{aligned}$$
which implies (2.9). Similarly, for any \(r>0\), we have
$$ \begin{aligned}\mathbb {P}\Biggl( \frac{1}{b_{n}}\max_{1\le k\le n}\Biggl\vert \sum_{i=1}^{k} Z_{ni}\Biggr\vert >r \Biggr) \le \mathbb {P}\Biggl( \bigcup_{i=1}^{n} \bigl\{ a_{i}\vert X_{i} \vert >b_{n}\bigr\} \Biggr) \le \sum_{i=1}^{n} \mathbb {P}\bigl( a_{i}\vert X_{i} \vert >b_{n} \bigr) \to 0,\end{aligned}$$
which yields (2.10). Finally,
$$ \frac{1}{b_{n}}\max_{1\le k\le n}\Biggl\vert \sum _{i=1}^{k} (a_{i}\mathbb {E}X_{ni}- \mathbb {E}Y_{ni})\Biggr\vert \le C\sum_{i=1}^{n} \mathbb {P}\bigl(a_{i}\vert X_{i} \vert >b_{n}\bigr) \to 0. $$
Based on the above discussions, the desired result is obtained. □