Almost sure central limit theorem for products of sums of partial sums

Abstract

For a sequence of i.i.d. positive random variables, we establish an almost sure central limit theorem for products of sums of partial sums that holds for a class of unbounded measurable functions.

1 Introduction and main results

Let \(\{X_{n};n\geq1\}\) be a sequence of random variables and define \(S_{n}=\sum_{i=1}^{n} X_{i}\). Several limit theorems for the products \(\prod_{j=1}^{n}S_{j}\) have been obtained in recent years. Rempala and Wesolowski [1] obtained the following asymptotics for products of sums of a sequence of i.i.d. random variables.

Theorem A

Let \(\{X_{n};n\geq1\}\) be a sequence of i.i.d. positive square integrable random variables with \(\mathbb{E}X_{1}=\mu\), \(\operatorname{Var}(X_{1})=\sigma^{2}\), and coefficient of variation \(\gamma=\sigma/\mu\). Then

$$ \biggl(\frac{\prod_{k=1}^{n} S_{k}}{n!\mu^{n}} \biggr)^{\frac{1}{\gamma \sqrt{n}}}\stackrel{d}{\rightarrow} {e}^{\sqrt{2}\mathcal{N}} \quad \textit{as } n\to\infty. $$
(1.1)

Here and in the sequel, \(\mathcal{N}\) is a standard normal random variable and \(\stackrel{d}{\rightarrow}\) denotes the convergence in distribution.
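
As an informal illustration (added here, not part of the original argument), the convergence in (1.1) can be checked by simulation. The following Python sketch assumes \(X_{i}\sim\operatorname{Exp}(1)\), so that \(\mu=\sigma=\gamma=1\); all names and parameters are illustrative.

```python
import numpy as np

# Monte Carlo sketch of (1.1), assuming X_i ~ Exp(1) (mu = sigma = gamma = 1).
# We work on the log scale to avoid overflow in the product of the S_k.
rng = np.random.default_rng(0)
n, reps = 2000, 5000
stats = np.empty(reps)
for r in range(reps):
    X = rng.exponential(1.0, size=n)
    S = np.cumsum(X)                       # partial sums S_1, ..., S_n
    k = np.arange(1, n + 1)
    # log of (prod_k S_k / (n! mu^n))^{1/(gamma sqrt(n))} = sum_k log(S_k/(k mu)) / (gamma sqrt(n))
    stats[r] = np.sum(np.log(S / k)) / np.sqrt(n)
# By Theorem A, stats is approximately sqrt(2) * N(0,1): mean near 0, variance near 2.
print(np.mean(stats), np.var(stats))
```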

Gonchigdanzan and Rempala [2] discussed the almost sure central limit theorem (ASCLT) for the products of partial sums and obtained the following result.

Theorem B

Let \(\{X_{n};n\geq1\}\) be a sequence of i.i.d. positive random variables with \(\mathbb{E}X_{1}=\mu\), \(\operatorname{Var}(X_{1})=\sigma^{2}\), and coefficient of variation \(\gamma=\sigma/\mu\). Then

$$ \lim_{N \to\infty}\frac{1}{\log N}\sum_{n=1}^{N} \frac {1}{n}I \biggl\{ \biggl(\frac{\prod_{k=1}^{n} S_{k}}{n!\mu^{n}} \biggr)^{\frac {1}{\gamma\sqrt{n}}}\leq x \biggr\} =F(x) \quad \textit{a.s. for any } x \in\mathbb{R}, $$
(1.2)

where F is the distribution function of the random variable \(e^{\sqrt {2}\mathcal{N}}\). Here and in the sequel, \(I \{\cdot \}\) denotes the indicator function.

Tan and Peng [3] proved that the result of Theorem B still holds for a class of unbounded measurable functions, obtaining the following result.

Theorem C

Let \(\{X_{n};n\geq1\}\) be a sequence of i.i.d. positive random variables with \(\mathbb{E}X_{1}=\mu\), \(\operatorname{Var}(X_{1})=\sigma^{2}\), \(\mathbb{E}|X_{1}|^{3}<\infty\), and coefficient of variation \(\gamma=\sigma/\mu\). Let \(g(x)\) be a real-valued, almost everywhere continuous function on \(\mathbb{R}\) such that \(|g(e^{x})\phi(x)|\leq c(1+|x|)^{-\alpha}\) for some \(c>0\) and \(\alpha>5\). Then

$$ \lim_{N \to\infty}\frac{1}{\log N}\sum_{n=1}^{N} \frac {1}{n}g \biggl( \biggl(\frac{\prod_{k=1}^{n} S_{k}}{n!\mu^{n}} \biggr)^{\frac {1}{\gamma\sqrt{n}}} \biggr) = \int_{0}^{\infty}g(x)\,\mathrm{d}F(x) \quad \textit{a.s.}, $$
(1.3)

where \(F(\cdot)\) is the distribution function of the random variable \(e^{\sqrt {2}\mathcal{N}}\) and \(\phi(x)\) is the density function of the standard normal random variable.

Zhang et al. [4] discussed the almost sure central limit theorem for products of sums of partial sums and obtained the following result.

Theorem D

Let \(\{X,X_{n};n\geq1\}\) be a sequence of i.i.d. positive square integrable random variables with \(\mathbb{E}X=\mu\), \(\operatorname{Var}(X)=\sigma^{2}<\infty\), and coefficient of variation \(\gamma=\sigma/\mu\). Denote \(S_{n}=\sum_{i=1}^{n} X_{i}\), \(T_{k}=\sum_{i=1}^{k} S_{i}\). Then

$$ \lim_{n \to\infty}\frac{1}{\log n}\sum_{k=1}^{n} \frac {1}{k}I \biggl\{ \biggl(\frac{2^{k}\prod_{j=1}^{k} T_{j}}{k!(k+1)!\mu^{k}} \biggr)^{\frac{1}{\gamma\sqrt{k}}}\leq x \biggr\} =F(x) \quad \textit{a.s. for any } x \in\mathbb{R}, $$
(1.4)

where \(F(\cdot)\) is the distribution function of the random variable \(e^{\sqrt{10/3}\mathcal{N}}\).
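
For orientation only, the logarithmic average in (1.4) can also be examined numerically along a single sample path. A minimal sketch follows, again assuming \(X_{i}\sim\operatorname{Exp}(1)\) (so \(\mu=\gamma=1\)); the weights \(1/k\) are normalized by \(\sum_{k\leq N}1/k\), which is asymptotically equivalent to \(\log N\).

```python
import math
import numpy as np

# One-path sketch of the logarithmic average in (1.4), assuming X_i ~ Exp(1).
rng = np.random.default_rng(1)
N = 20000
X = rng.exponential(1.0, size=N)
S = np.cumsum(X)                  # S_k
T = np.cumsum(S)                  # T_k = S_1 + ... + S_k
k = np.arange(1, N + 1)
# log of (2^k prod_j T_j / (k!(k+1)! mu^k))^{1/(gamma sqrt(k))}
log_stat = np.cumsum(np.log(2.0 * T / (k * (k + 1.0)))) / np.sqrt(k)
x = 2.0
w = 1.0 / k
lhs = np.sum(w * (log_stat <= math.log(x))) / np.sum(w)
# F(x) = Phi(sqrt(3/10) * log x) for the limit variable e^{sqrt(10/3) N(0,1)}
z = math.sqrt(3.0 / 10.0) * math.log(x)
rhs = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(lhs, rhs)                   # the two values should be close for large N
```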

The purpose of this article is to show that the result of Theorem D remains valid for a class of unbounded measurable functions.

Our main result is the following theorem.

Theorem 1.1

Let \(\{X_{n};n\geq1\}\) be a sequence of i.i.d. positive random variables with \(\mathbb{E}X_{1}=\mu\), \(\operatorname{Var}(X_{1})=\sigma^{2}\), \(\mathbb{E}|X_{1}|^{3}<\infty\), and coefficient of variation \(\gamma=\sigma/\mu\). Let \(g(x)\) be a real-valued, almost everywhere continuous function on \(\mathbb{R}\) such that \(|g(e^{\sqrt{10/3} x})\phi(x)|\leq c(1+|x|)^{-\alpha}\) for some \(c>0\) and \(\alpha>5\). Denote \(S_{n}=\sum_{i=1}^{n} X_{i}\), \(T_{k}=\sum_{i=1}^{k} S_{i}\). Then

$$\begin{aligned}& \lim_{N \to\infty}\frac{1}{\log N}\sum_{n=1}^{N} \frac {1}{n}g \biggl( \biggl(\frac{2^{n}\prod_{k=1}^{n} T_{k}}{n!(n+1)!{\mu }^{n}} \biggr)^{\frac{1}{\gamma\sqrt{n}}} \biggr) \\& \quad = \int_{0}^{\infty }g(x)\,\mathrm{d}F(x) \quad \textit{a.s.}, \end{aligned}$$
(1.5)

where \(F(\cdot)\) is the distribution function of the random variable \(e^{\sqrt{10/3}\mathcal{N}}\). Here and in the sequel, \(\phi(x)\) is the density function of the standard normal random variable.

Remark 1

Let \(f(x)=g(e^{\sqrt{10/3} x})\), \(t=e^{\sqrt{10/3} x}\). Then

$$\begin{aligned}& x=\sqrt{\frac{3}{10}}\log t, \qquad g(t)=f \biggl(\sqrt {\frac{3}{10}} \log t \biggr), \\& g \biggl( \biggl(\frac{2^{n}\prod_{k=1}^{n} T_{k}}{n!(n+1)!{\mu}^{n}} \biggr)^{\frac{1}{\gamma\sqrt{n}}} \biggr) =f \Biggl( \sqrt{\frac{3}{10}}\log \Biggl(\prod_{k=1}^{n} \frac {2T_{k}}{k(k+1)\mu} \Biggr)^{\frac{1}{\gamma\sqrt{n}}} \Biggr) \\& \hphantom{g \biggl( \biggl(\frac{2^{n}\prod_{k=1}^{n} T_{k}}{n!(n+1)!{\mu}^{n}} \biggr)^{\frac{1}{\gamma\sqrt{n}}} \biggr)}=f \Biggl(\frac{1}{\gamma\sqrt{10n/3}}\sum _{k=1}^{n} \log \frac {T_{k}}{k(k+1)\mu/2} \Biggr). \end{aligned}$$

Since \(F(x)\) is the distribution function of the random variable \(e^{\sqrt{10/3}\mathcal{N}}\), we get \(F(x)=\Phi (\sqrt{\frac{3}{10}}\log x )\), where \(\Phi (x)\) is the distribution function of the standard normal random variable. Hence, if \(f(x)=g(e^{\sqrt{10/3} x})\) is a real-valued, almost everywhere continuous function on \(\mathbb{R}\) such that \(|f(x)\phi(x)|\leq c(1+|x|)^{-\alpha}\) for some \(c>0\) and \(\alpha>5\), then (1.5) is equivalent to

$$\begin{aligned}& \lim_{N \to\infty}\frac{1}{\log N}\sum_{n=1}^{N} \frac {1}{n}f \Biggl(\frac{1}{\gamma\sqrt{10n/3}}\sum_{k=1}^{n} \log \frac {T_{k}}{k(k+1)\mu/2} \Biggr) \\& \quad = \int_{-\infty}^{\infty}f(x)\phi(x)\,\mathrm {d}x\quad \mbox{a.s.} \end{aligned}$$
(1.6)
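
The right-hand sides of (1.5) and (1.6) match under the same substitution; the following worked step (added here for convenience) makes this explicit:

$$\int_{0}^{\infty}g(x)\,\mathrm{d}F(x)= \int_{0}^{\infty}g(x)\,\mathrm{d}\Phi \biggl(\sqrt{\frac{3}{10}}\log x \biggr) = \int_{-\infty}^{\infty}g \bigl(e^{\sqrt{10/3}\,u} \bigr)\phi(u)\,\mathrm{d}u= \int_{-\infty}^{\infty}f(u)\phi(u)\,\mathrm{d}u, $$

where the second equality uses the substitution \(x=e^{\sqrt{10/3}\,u}\).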

Remark 2

By the proof of Theorem 2 of Berkes et al. [5], in order to prove (1.5) it suffices to show that (1.6) holds for the function \(f\) determined by \(f(x)\phi(x)=(1+|x|)^{-\alpha}\) with \(\alpha>5\). Here and in the sequel, \(f(x)\) satisfies \(f(x)\phi(x)=(1+|x|)^{-\alpha}\) with \(\alpha>5\).
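
To see that this choice is indeed unbounded (an added observation): since \(\phi(x)=\frac{1}{\sqrt{2\pi}}e^{-x^{2}/2}\), the function reads

$$f(x)=\sqrt{2\pi}\,e^{x^{2}/2}(1+|x|)^{-\alpha}, $$

which grows faster than any polynomial as \(|x|\to\infty\); this is the sense in which Theorem 1.1 covers unbounded measurable functions.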

2 Preliminaries

In the following, the notation \(a_{n}\sim b_{n}\) means that \(\lim_{n \to\infty}a_{n}/ b_{n}= 1\) and \(a_{n}\ll b_{n}\) means that \(\limsup_{n \to\infty}|a_{n}/ b_{n}|<+\infty\). We denote \(b_{k,n}=\sum_{j=k}^{n} \frac {1}{j}\), \(c_{k,n}=2\sum_{j=k}^{n} \frac{j+1-k}{j(j+1)}\), \(d_{k,n}=\frac {n+1-k}{n+1}\), \(\widetilde{X}_{i}=\frac{{X}_{i}-\mu}{\sigma}\), \(\widetilde{S}_{k}=\sum_{i=1}^{k}\widetilde{X}_{i}\), \({S}_{k,n}=\sum_{i=1}^{k} c_{i,n}\widetilde{X}_{i}\). By Lemma 2.1 of Wu [6], we can get

$$c_{i,n}=2(b_{i,n}-d_{i,n}), \qquad \sum _{i=1}^{n} c_{i,n}^{2} \sim \frac{10n}{3}. $$
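
As a quick numerical sanity check (illustrative only, not needed for the proofs), the identity \(c_{i,n}=2(b_{i,n}-d_{i,n})\) and the asymptotic \(\sum_{i=1}^{n} c_{i,n}^{2}\sim\frac{10n}{3}\) can be verified as follows.

```python
import numpy as np

# Check c_{i,n} = 2(b_{i,n} - d_{i,n}) and sum_i c_{i,n}^2 ~ 10n/3 numerically.
n = 5000
i = np.arange(1, n + 1)
# b_{i,n} = sum_{j=i}^{n} 1/j via a reversed cumulative sum
b = np.cumsum((1.0 / i)[::-1])[::-1]
d = (n + 1.0 - i) / (n + 1.0)
c = 2.0 * (b - d)
# compare with the direct definition c_{i,n} = 2 sum_{j=i}^{n} (j+1-i)/(j(j+1)) for a few i
for i0 in (1, 10, 1000):
    j = np.arange(i0, n + 1)
    direct = 2.0 * np.sum((j + 1.0 - i0) / (j * (j + 1.0)))
    assert abs(direct - c[i0 - 1]) < 1e-8
print(np.sum(c ** 2) / n)   # should be close to 10/3 = 3.3333...
```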

Let

$$Y_{i}=\frac{1}{\gamma\sqrt{10i/3}}\sum_{k=1}^{i} \log \frac {T_{k}}{k(k+1)\mu/2}. $$

Note that

$$\begin{aligned}& \frac{1}{\gamma}\sum_{k=1}^{i} \biggl( \frac{T_{k}}{k(k+1)\mu/2}-1 \biggr) \\& \quad =\frac{1}{\gamma}\sum_{k=1}^{i} \biggl(\frac{2\sum_{j=1}^{k} S_{j}-k(k+1)\mu }{k(k+1)\mu} \biggr) \\& \quad =\frac{1}{\gamma}\sum_{k=1}^{i} \frac{2}{k(k+1)\mu}\sum_{j=1}^{k}\sum _{l=1}^{j}(X_{l}-\mu) \\& \quad =\frac{1}{\gamma}\sum_{k=1}^{i} \frac{2}{k(k+1)\mu}\sum_{l=1}^{k}\sum _{j=l}^{k}(X_{l}-\mu) \\& \quad =\frac{\mu}{\sigma}\sum_{k=1}^{i} \frac{2}{k(k+1)\mu}\sum_{l=1}^{k}(k+1-l) (X_{l}-\mu) \\& \quad =\sum_{k=1}^{i}\sum _{l=1}^{k} \frac{2(k+1-l)}{k(k+1)}\frac{X_{l}-\mu }{\sigma} \\& \quad =\sum_{l=1}^{i} \sum _{k=l}^{i}\frac{2(k+1-l)}{k(k+1)}\widetilde{X}_{l} \\& \quad =\sum_{l=1}^{i} c_{l,i} \widetilde{X}_{l}=S_{i,i}. \end{aligned}$$
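
The algebraic identity above can also be confirmed numerically; a minimal sketch (assuming the \(X_{j}\) are exponential with mean \(1/2\), so \(\mu=\sigma=1/2\) and \(\gamma=1\)) is given below.

```python
import numpy as np

# Numerical check of
#   (1/gamma) * sum_{k<=i} (T_k/(k(k+1)mu/2) - 1) = sum_{l<=i} c_{l,i} * Xtilde_l = S_{i,i},
# assuming exponential X_j with mean 1/2 (so mu = sigma = 1/2, gamma = 1).
rng = np.random.default_rng(2)
i, mu, sigma = 200, 0.5, 0.5
X = rng.exponential(mu, size=i)
S = np.cumsum(X)
T = np.cumsum(S)
k = np.arange(1, i + 1)
lhs = np.sum(T / (k * (k + 1.0) * mu / 2.0) - 1.0) / (sigma / mu)
Xtilde = (X - mu) / sigma
c = np.array([2.0 * np.sum((np.arange(l, i + 1) + 1.0 - l) /
                           (np.arange(l, i + 1) * (np.arange(l, i + 1) + 1.0)))
              for l in k])
rhs = np.sum(c * Xtilde)
print(lhs, rhs)   # the two values agree up to floating point rounding
```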

Since \(\log (1+x)=x+\frac{\delta}{2}x^{2}\), where \(|x|<1\), \(\delta\in(-1,0)\), we have

$$\begin{aligned} Y_{i}&=\frac{1}{\gamma\sqrt{10i/3}}\sum_{k=1}^{i} \log \frac {T_{k}}{k(k+1)\mu/2} \\ &=\frac{1}{\gamma\sqrt{10i/3}}\sum_{k=1}^{i} \biggl( \frac{T_{k}}{k(k+1)\mu /2}-1 \biggr)+\frac{1}{\gamma\sqrt{10i/3}} \sum_{k=1}^{i} \frac{\delta_{k}}{2} \biggl(\frac{T_{k}}{k(k+1)\mu/2}-1 \biggr)^{2} \\ &=\frac{1}{\sqrt{10i/3}}S_{i,i}+\frac{1}{\gamma\sqrt{10i/3}}\sum _{k=1}^{i}\frac{\delta_{k}}{2} \biggl( \frac{T_{k}}{k(k+1)\mu/2}-1 \biggr)^{2} \\ &=:\frac{1}{\sqrt{10i/3}}S_{i,i}+R_{i}. \end{aligned}$$

Since \(\mathbb{E}|X_{1}|^{2}<\infty\), by the Marcinkiewicz-Zygmund strong law of large numbers we have

$$\begin{aligned}& S_{k}-k\mu=o \bigl(k^{1/2} \bigr) \quad \mbox{a.s.}, \\& \biggl\vert \frac{T_{k}}{k(k+1)\mu/2}-1 \biggr\vert = \biggl\vert \frac{2\sum_{j=1}^{k} S_{j}-k(k+1)\mu}{k(k+1)\mu} \biggr\vert \\& \hphantom{\biggl\vert \frac{T_{k}}{k(k+1)\mu/2}-1 \biggr\vert }\leq\frac{2\vert \sum_{j=1}^{k} (S_{j}-j\mu)\vert }{k(k+1)\mu} \\& \hphantom{\biggl\vert \frac{T_{k}}{k(k+1)\mu/2}-1 \biggr\vert }\leq\frac{2\sum_{j=1}^{k} j^{1/2}}{k(k+1)\mu}\ll\frac {k^{3/2}}{k^{2}}=\frac{1}{k^{1/2}}. \end{aligned}$$

Thus

$$ |R_{i}|\ll\frac{1}{\sqrt{i}}\sum_{k=1}^{i} \frac{1}{k}\ll\frac{\log i}{\sqrt{i}}\quad \mbox{a.s.} $$
(2.1)

In order to prove Theorem 1.1, we introduce the following lemmas.

Lemma 2.1

Let X and Y be random variables. Set \(F(x)=P(X< x)\) and \(G(x)=P(X+Y< x)\). Then, for any \(\varepsilon>0\) and \(x\in\mathbb{R}\),

$$ F(x-\varepsilon)-P \bigl(\vert Y\vert \geq\varepsilon \bigr)\leq G(x)\leq F(x+ \varepsilon )+P \bigl(\vert Y\vert \geq\varepsilon \bigr). $$

Proof

See Lemma 3 on p.16 of Petrov [7]. □
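
For the reader's convenience, the upper bound follows from an elementary inclusion (the lower bound is obtained symmetrically); this worked step is added here and is not taken from [7]:

$$\{X+Y< x\}\subseteq\{X< x+\varepsilon\}\cup \bigl\{ \vert Y\vert \geq\varepsilon \bigr\} \quad \Longrightarrow\quad G(x)\leq F(x+\varepsilon)+P \bigl(\vert Y\vert \geq\varepsilon \bigr). $$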

Lemma 2.2

Let \(\{X_{n};n\geq1\}\) be a sequence of i.i.d. positive random variables with common distribution function F, and denote \(S_{n}=\sum_{i=1}^{n} X_{i}\). Let \(F^{s}\) denote the distribution function obtained from F by symmetrization, and choose \(L >0\) so large that \(\int_{|x|\leq L}x^{2}\,\mathrm{d}F^{s}(x)\geq1\). Then there exists a constant \(c>0\) such that, for any \(n\geq1\) and \(\lambda>0\),

$$\sup_{a}P \biggl(a\leq\frac{S_{n}}{\sqrt{n}}\leq a+\lambda \biggr) \leq c\lambda $$

holds for \(\lambda\sqrt{n}\geq L\).

Proof

See (20) on p.73 of Berkes et al. [5]. □

Let

$$\begin{aligned}& Z_{k}=\sum_{i=2^{k}+1}^{2^{k+1}} \frac{1}{i}f(Y_{i}), \\& Z_{k}^{*}=\sum_{i=2^{k}+1}^{2^{k+1}} \frac{1}{i}f(Y_{i})I \biggl\{ f(Y_{i})\leq \frac{k}{(\log k)^{\beta}} \biggr\} , \end{aligned}$$

where \(1<\beta<(\alpha-3)/2\).

Lemma 2.3

Under the conditions of Theorem  1.1, we get

$$\mathbb{P} \bigl(Z_{k}\neq Z_{k}^{*}, \mathrm{i.o.} \bigr)=0. $$

Proof

It is easy to get

$$\begin{aligned} \bigl\{ Z_{k}\neq Z_{k}^{*} \bigr\} \subseteq& \bigl\{ |Y_{i}|\geq f^{-1} \bigl(k/(\log k)^{\beta} \bigr) \mbox{ for some } 2^{k}< i\leq2^{k+1} \bigr\} \\ =& \biggl\{ \biggl\vert \frac{1}{\sqrt{10i/3}}S_{i,i}+R_{i} \biggr\vert \geq f^{-1} \bigl(k/(\log k)^{\beta} \bigr)\geq \bigl(2\log k+(\alpha-2\beta)\log \log k \bigr)^{1/2} \\ &\mbox{for some } 2^{k}< i\leq2^{k+1} \biggr\} . \end{aligned}$$

Since \(|R_{i}|\ll\frac{\log i}{\sqrt{i}}\) a.s. by (2.1), the law of the iterated logarithm (Feller [8], Theorem 2) gives

$$\begin{aligned} \mathbb{P} \bigl(Z_{k}\neq Z_{k}^{*}, \mathrm{i.o.} \bigr) \leq& \mathbb{P} \biggl( \biggl\vert \frac {1}{\sqrt{10i/3}}S_{i,i} \biggr\vert \geq \bigl(2\log \log i+(\alpha-2\beta)\log \log \log i-O(1) \bigr)^{1/2}, \mathrm{i.o.} \biggr) \\ =&0. \end{aligned}$$

We complete the proof of Lemma 2.3. □

Let \(G_{i}\), \(F_{i}\), F denote the distribution functions of \(Y_{i}\), \(\frac {\widetilde{S}_{i}}{\sqrt{i}}\), \(\widetilde{X}_{1}\), respectively, and let Φ denote the standard normal distribution function. Set

$$\begin{aligned}& \sigma_{i}^{2} = \int_{-\sqrt{i}}^{\sqrt{i}}x^{2}\,\mathrm{d}F(x)- \biggl( \int _{-\sqrt{i}}^{\sqrt{i}}x\,\mathrm{d}F(x) \biggr)^{2}, \\& \varepsilon_{i} =\sup_{x} \biggl\vert F_{i}(x)-\Phi \biggl(\frac{x}{\sigma_{i}} \biggr) \biggr\vert ,\qquad \theta_{i}=\sup_{x} \biggl\vert G_{i}(x)-\Phi \biggl(\frac{x}{\sigma_{i}} \biggr) \biggr\vert . \end{aligned}$$

Obviously \(\sigma_{i}\leq1\), \(\lim_{i \to\infty}\sigma_{i}= 1\).

Lemma 2.4

Under the conditions of Theorem  1.1, we have

$$\sum_{k=1}^{N}\mathbb{E} \bigl(Z_{k}^{*} \bigr)^{2}\ll\frac{N^{2}}{(\log N)^{2\beta}}. $$

Proof

Note that the estimate

$$ \biggl\vert \int_{-a}^{a}\Psi(x)\,\mathrm{d} \bigl(H_{1}(x)-H_{2}(x) \bigr) \biggr\vert \leq\sup _{-a\leq x \leq a} \bigl\vert \Psi(x) \bigr\vert \cdot\sup _{-a\leq x \leq a} \bigl\vert H_{1}(x)-H_{2}(x) \bigr\vert $$
(2.2)

holds for any bounded measurable function \(\Psi(x)\) and any distribution functions \(H_{1}(x)\), \(H_{2}(x)\). Thus for \(2^{k}< i\leq 2^{k+1}\), we get

$$\begin{aligned}& \mathbb{E}f^{2}(Y_{i})I \biggl\{ f(Y_{i})\leq \frac{k}{(\log k)^{\beta }} \biggr\} \\& \quad = \int_{|x|\leq a_{k}}f^{2}(x)\,\mathrm{d}G_{i}(x) \\& \quad \leq \int_{|x|\leq a_{k}}f^{2}(x)\,\mathrm{d}\Phi \biggl( \frac{x}{\sigma _{i}} \biggr)+\theta_{i}\frac{k^{2}}{(\log k)^{2\beta}} \\& \quad \ll \int_{|x|\leq a_{k}}f^{2}(x)\,\mathrm{d}\Phi(x)+ \theta_{i}\frac {k^{2}}{(\log k)^{2\beta}}; \end{aligned}$$

here and in the sequel \(a_{k}=f^{-1}(\frac{k}{(\log k)^{\beta }})\). Hence, by the Cauchy-Schwarz inequality and the fact that \(f(x)\phi(x)=(1+|x|)^{-\alpha}\), we obtain

$$\begin{aligned} \mathbb{E} \bigl(Z_{k}^{*} \bigr)^{2} &\ll\mathbb{E} \Biggl( \Biggl(\sum_{i=2^{k}+1}^{2^{k+1}} \biggl( \frac{1}{i} \biggr)^{2} \Biggr)^{1/2} \Biggl(\sum _{i=2^{k}+1}^{2^{k+1}}f^{2}(Y_{i})I \biggl\{ f(Y_{i}) \leq\frac{k}{(\log k)^{\beta}} \biggr\} \Biggr)^{1/2} \Biggr)^{2} \\ &\ll \Biggl(\sum_{i=2^{k}+1}^{2^{k+1}} \frac{1}{i^{2}} \Biggr) \Biggl(\sum_{i=2^{k}+1}^{2^{k+1}} \biggl( \int_{|x|\leq a_{k}}f^{2}(x)\,\mathrm{d}\Phi (x)+ \theta_{i}\frac{k^{2}}{(\log k)^{2\beta}} \biggr) \Biggr) \\ &\ll\frac{1}{2^{k}} \Biggl(2^{k} \int_{|x|\leq a_{k}}f^{2}(x)\,\mathrm{d}\Phi (x)+ \frac{k^{2}}{(\log k)^{2\beta}}\sum_{i=2^{k}+1}^{2^{k+1}}\theta _{i} \Biggr) \\ &\ll \int_{|x|\leq a_{k}}\frac{e^{x^{2}/2}}{(1+|x|)^{2\alpha}}\,\mathrm {d}x+\frac{k^{2}}{(\log k)^{2\beta}} \sum_{i=2^{k}+1}^{2^{k+1}}\frac {\theta_{i}}{i}. \end{aligned}$$

By the same method as on p.72 of Berkes et al. [5], we get

$$\int_{|x|\leq a_{k}}\frac{e^{x^{2}/2}}{(1+|x|)^{2\alpha}}\,\mathrm{d}x\ll \frac{k}{(\log k)^{\beta+(\alpha+1)/2}}. $$

Now we estimate \(\theta_{i}\). By Lemma 2.1, for any \(\varepsilon>0\), we have

$$\begin{aligned} \theta_{i} =&\sup_{x} \biggl\vert G_{i}(x)-\Phi \biggl(\frac{x}{\sigma_{i}} \biggr) \biggr\vert \\ \leq&\sup_{x} \bigl\vert G_{i}(x)-F_{i}(x) \bigr\vert +\sup_{x} \biggl\vert F_{i}(x)-\Phi \biggl(\frac{x}{\sigma_{i}} \biggr) \biggr\vert \\ =&\sup_{x} \biggl\vert P(Y_{i}\leq x)-P \biggl( \frac{\widetilde{S}_{i}}{\sqrt {i}}\leq x \biggr) \biggr\vert +\varepsilon_{i} \\ \leq&\sup_{x} \biggl\vert P(Y_{i}\leq x)-P \biggl(\frac{S_{i,i}}{\sqrt {10i/3}}\leq x \biggr) \biggr\vert + \sup_{x} \biggl\vert P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x \biggr)-P \biggl( \frac{\widetilde{S}_{i}}{\sqrt{i}}\leq x \biggr) \biggr\vert +\varepsilon _{i} \\ \leq&\sup_{x} \biggl\vert P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}+R_{i} \leq x \biggr)-P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x+\varepsilon \biggr) \biggr\vert \\ &{} +\sup_{x} \biggl\vert P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x+\varepsilon \biggr)-P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x \biggr) \biggr\vert \\ &{} +\sup_{x} \biggl\vert P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x \biggr)-P \biggl(\frac{\widetilde{S}_{i}}{\sqrt{i}}\leq x \biggr) \biggr\vert + \varepsilon_{i} \\ \leq& P \bigl(\vert R_{i}\vert \geq\varepsilon \bigr)+\sup _{x} \biggl\vert P \biggl(\frac {S_{i,i}}{\sqrt{10i/3}}\leq x+ \varepsilon \biggr)-P \biggl(\frac {S_{i,i}}{\sqrt{10i/3}}\leq x \biggr) \biggr\vert \\ &{}+ \sup_{x} \biggl\vert P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x \biggr)-P \biggl(\frac{\widetilde{S}_{i}}{\sqrt{i}}\leq x \biggr) \biggr\vert + \varepsilon_{i}. \end{aligned}$$

By the Markov inequality and (2.1), we have

$$P \bigl(\vert R_{i}\vert \geq\varepsilon \bigr)\leq \frac{\mathbb{E}|R_{i}|}{\varepsilon}\ll \frac{\log i}{\sqrt{i}\varepsilon}. $$

By Lemma 2.2, we have

$$\sup_{x} \biggl\vert P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x+ \varepsilon \biggr)-P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x \biggr) \biggr\vert \ll \varepsilon. $$

By the Berry-Esseen inequality, we have

$$\begin{aligned}& \sup_{x} \biggl\vert P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}}\leq x \biggr)-P \biggl(\frac{\widetilde{S}_{i}}{\sqrt{i}}\leq x \biggr) \biggr\vert \\& \quad \leq\sup_{x} \biggl\vert P \biggl(\frac{S_{i,i}}{\sqrt{10i/3}} \leq x \biggr)-\Phi(x) \biggr\vert +\sup_{x} \biggl\vert P \biggl(\frac{\widetilde{S}_{i}}{\sqrt {i}}\leq x \biggr)-\Phi(x) \biggr\vert \\& \quad \ll\frac{1}{i^{1/2}}+\frac{1}{i^{1/2}}. \end{aligned}$$

Taking \(\varepsilon=i^{-1/3}\), we get

$$\theta_{i}\ll\frac{\log i}{i^{1/6}}+ \frac{1}{i^{1/3}}+ \frac {1}{i^{1/2}}+\varepsilon_{i}. $$

Therefore, there exists \(\varepsilon_{0}>0\) such that

$$\theta_{i}\ll\frac{1}{i^{\varepsilon_{0}}}+\varepsilon_{i}. $$

By Theorem 1 of Friedman et al. [9], we have

$$\sum_{i=1}^{\infty}\frac{\varepsilon_{i}}{i}< \infty. $$

Hence

$$\sum_{i=1}^{\infty}\frac{\theta_{i}}{i}\ll\sum _{i=1}^{\infty}\frac{\frac {1}{i^{\varepsilon_{0}}}+\varepsilon_{i}}{i}< \infty. $$

By the fact that \((\alpha+1)/2>\beta\), we have

$$\sum_{k=1}^{N}\mathbb{E} \bigl(Z_{k}^{*} \bigr)^{2}\ll\sum_{k=1}^{N} \frac{k}{(\log k)^{\beta+(\alpha+1)/2}} +\sum_{k=1}^{N} \frac{k^{2}}{(\log k)^{2\beta}}\sum_{i=2^{k}+1}^{2^{k+1}} \frac{\theta_{i}}{i}\ll\frac{N^{2}}{(\log N)^{2\beta}}. $$

We complete the proof of Lemma 2.4. □

Lemma 2.5

Under the conditions of Theorem  1.1, for \(l\geq l_{0}\), we have

$$\bigl\vert \operatorname{Cov} \bigl(Z_{k}^{*},Z_{l}^{*} \bigr) \bigr\vert \ll\frac{kl}{(\log k)^{\beta }(\log l)^{\beta}}2^{-(l-k)\tau}, $$

where \(\tau\) is a constant with \(0<\tau\leq1/8\).

Proof

For \(1\leq i \leq j/2\), \(j\geq j_{0}\) and any x, y, we first prove

$$ \bigl\vert P(Y_{i}\leq x,Y_{j}\leq y)-P(Y_{i} \leq x)P(Y_{j}\leq y) \bigr\vert \ll \biggl(\frac {i}{j} \biggr)^{\tau}. $$
(2.3)

Let \(\rho=\frac{i}{j}\). By the Chebyshev inequality, we have

$$P \biggl( \biggl\vert \frac{S_{i,i}}{\sqrt{10j/3}} \biggr\vert \geq\rho^{1/8} \biggr) =P \biggl( \biggl\vert \frac{S_{i,i}}{\sqrt{10i/3}} \biggr\vert \geq\sqrt{ \frac {j}{i}} {\rho}^{1/8} \biggr)\leq\frac{i}{j} \rho^{-1/4}\leq{\rho }^{1/8}\leq\rho^{\tau_{1}}, $$

where \(\tau_{1}\) is a constant \(0<\tau_{1}\leq1/8\).

By the Markov inequality and (2.1), for \(j\geq j_{0}\), we have

$$P \bigl(\vert R_{j}\vert \geq\rho^{1/8} \bigr)\leq \frac{\mathbb{E}|R_{j}|}{\rho^{1/8}}\ll \frac{\log j}{j^{1/2}\rho^{1/8}} =\rho^{1/8}\frac{\log j}{j^{1/4}i^{1/4}}\ll \rho^{\tau_{2}}, $$

where \(\tau_{2}\) is a constant, \(0<\tau_{2}\leq1/8\).

By the Markov inequality, we have

$$\begin{aligned}& P \biggl( \biggl\vert \sqrt{1-\rho}\frac{c_{i+1,j}\widetilde{S}_{i}}{\sqrt {10(j-i)/3}} \biggr\vert \geq \rho^{1/8} \biggr) \\& \quad =P \biggl( \biggl\vert \frac{\widetilde{S}_{i}}{\sqrt{i}} \biggr\vert \geq\sqrt{ \frac {10j/3}{i}}\frac{1}{c_{i+1,j}}{\rho}^{1/8} \biggr) \\& \quad \leq\frac{3}{10}{\rho}^{3/4}(c_{i+1,j})^{2}= \frac{3}{10}{\rho }^{3/4}(2b_{i+1,j}-2d_{i+1,j})^{2} \\& \quad =\frac{3}{10}{\rho}^{3/4} \Biggl[ \Biggl(2\sum _{k=i+1}^{j}\frac{1}{k} \Biggr)^{2}+4 \biggl(\frac {j+1-i-1}{j+1} \biggr)^{2}-8 \Biggl(\sum _{k=i+1}^{j}\frac{1}{k} \Biggr) \frac {j+1-i-1}{j+1} \Biggr] \\& \quad \ll{\rho}^{3/4} \biggl[ \biggl(\log \frac{j}{i} \biggr)^{2}+ \biggl(\frac {j-i}{j+1} \biggr)^{2}- \frac{j-i}{j+1}\log \frac{j}{i} \biggr] \\& \quad \ll\rho^{\tau_{3}}, \end{aligned}$$

where \(\tau_{3}\) is a constant that satisfies \(0<\tau_{3}\leq1/8\).

By Lemma 2.2 and the fact that \(\rho=\frac{i}{j}\), \(1\leq i \leq j/2\), we have

$$P \biggl(y-3{\rho}^{1/8}\leq\sqrt{1-\rho}\frac {S_{j,j}-S_{i,i}-c_{i+1,j}\widetilde{S}_{i}}{\sqrt{10(j-i)/3}}\leq y \biggr)\ll\frac{{\rho}^{1/8}}{\sqrt{1-\rho}}\ll\rho^{1/8}. $$

Setting \(\tau=\min\{\tau_{1},\tau_{2},\tau_{3},1/8\}\), we get

$$\begin{aligned}& P(Y_{i}\leq x,Y_{j}\leq y) \\& \quad =P \biggl(Y_{i}\leq x,\frac{S_{j,j}}{\sqrt{10j/3}}+R_{j}\leq y \biggr) \\& \quad =P \biggl(Y_{i}\leq x,\frac{S_{i,i}}{\sqrt{10j/3}}+\sqrt{1-\rho} \frac {S_{j,j}-S_{i,i}-c_{i+1,j}\widetilde{S}_{i}}{\sqrt{10(j-i)/3}} +\sqrt{1-\rho}\frac{c_{i+1,j}\widetilde{S}_{i}}{\sqrt {10(j-i)/3}}+R_{j}\leq y \biggr) \\& \quad \geq P \biggl(Y_{i}\leq x,\sqrt{1-\rho}\frac {S_{j,j}-S_{i,i}-c_{i+1,j}\widetilde{S}_{i}}{\sqrt{10(j-i)/3}} \leq y \biggr) \\& \qquad {} -P \biggl(y-3{\rho}^{1/8}\leq\sqrt{1-\rho}\frac {S_{j,j}-S_{i,i}-c_{i+1,j}\widetilde{S}_{i}}{\sqrt{10(j-i)/3}} \leq y \biggr)-P \biggl( \biggl\vert \frac{S_{i,i}}{\sqrt{10j/3}} \biggr\vert \geq\rho ^{1/8} \biggr) \\& \qquad {} -P \biggl( \biggl\vert \sqrt{1-\rho}\frac{c_{i+1,j}\widetilde{S}_{i}}{\sqrt {10(j-i)/3}} \biggr\vert \geq\rho^{1/8} \biggr) -P \bigl(\vert R_{j}\vert \geq \rho^{1/8} \bigr) \\& \quad \geq P \biggl(Y_{i}\leq x,\sqrt{1-\rho}\frac {S_{j,j}-S_{i,i}-c_{i+1,j}\widetilde{S}_{i}}{\sqrt{10(j-i)/3}} \leq y \biggr)-\rho^{\tau} \\& \quad = P(Y_{i}\leq x)P \biggl(\sqrt{1-\rho}\frac {S_{j,j}-S_{i,i}-c_{i+1,j}\widetilde{S}_{i}}{\sqrt{10(j-i)/3}} \leq y \biggr)-\rho^{\tau}. \end{aligned}$$

We can get a similar upper estimate for \(P(Y_{i}\leq x,Y_{j}\leq y)\) in the same way. Thus there exists some constant M such that

$$P(Y_{i}\leq x,Y_{j}\leq y)=P(Y_{i}\leq x)P \biggl(\sqrt{1-\rho}\frac {S_{j,j}-S_{i,i}-c_{i+1,j}\widetilde{S}_{i}}{\sqrt{10(j-i)/3}} \leq y \biggr)+M\rho^{\tau}. $$

By a similar argument,

$$P(Y_{i}\leq x)P(Y_{j}\leq y)=P(Y_{i}\leq x)P \biggl(\sqrt{1-\rho}\frac {S_{j,j}-S_{i,i}-c_{i+1,j}\widetilde{S}_{i}}{\sqrt{10(j-i)/3}} \leq y \biggr)+M' \rho^{\tau}, $$

holds for some constant \(M'\). Thus (2.3) holds.

Let \(G_{i,j}(x,y)\) be the joint distribution function of \(Y_{i}\) and \(Y_{j}\). By (2.2) and (2.3), for \(2^{k}< i\leq2^{k+1}\), \(2^{l}< j\leq 2^{l+1}\), \(l-k\geq3\), \(l\geq l_{0}\), we can get

$$\begin{aligned}& \biggl\vert \operatorname{Cov} \biggl(f(Y_{i})I \biggl\{ f(Y_{i})\leq\frac{k}{(\log k)^{\beta}} \biggr\} ,f(Y_{j})I \biggl\{ f(Y_{j})\leq\frac{l}{(\log l)^{\beta}} \biggr\} \biggr) \biggr\vert \\& \quad = \biggl\vert \int_{|x|\leq a_{k}} \int_{|y|\leq a_{l}}f(x)f(y)\,\mathrm {d} \bigl(G_{i,j}(x,y)-G_{i}(x)G_{j}(y) \bigr) \biggr\vert \\& \quad \ll\frac{kl}{(\log k)^{\beta}(\log l)^{\beta}} \biggl(\frac {i}{j} \biggr)^{\tau} \ll \frac{kl}{(\log k)^{\beta}(\log l)^{\beta }}2^{-(l-k-1)\tau}. \end{aligned}$$

Thus we have

$$\bigl\vert \operatorname{Cov} \bigl(Z_{k}^{*},Z_{l}^{*} \bigr) \bigr\vert \ll\frac{kl}{(\log k)^{\beta }(\log l)^{\beta}}2^{-(l-k)\tau}. $$

We complete the proof of Lemma 2.5. □

Lemma 2.6

Under the conditions of Theorem  1.1, denoting \(\eta _{k}=Z_{k}^{*}-\mathbb{E}Z_{k}^{*}\), we have

$$\mathbb{E} \Biggl(\sum_{k=1}^{N} \eta_{k} \Biggr)^{2}=O \biggl(\frac{N^{2}}{(\log N)^{2\beta-1}} \biggr). $$

Proof

Lemma 2.6 follows from Lemma 2.4 and Lemma 2.5; the proof is similar to that of Lemma 4 of Berkes et al. [5], so we omit it here. □

3 Proof of Theorem 1.1

By Lemma 2.6, we have

$$\mathbb{E} \Biggl(\frac{1}{N}\sum_{k=1}^{N} \eta_{k} \Biggr)^{2}=O \bigl((\log N)^{1-2\beta} \bigr). $$

Letting \(N_{k}=[e^{k^{\lambda}}]\), \((2\beta-1)^{-1}<\lambda<1\), we get

$$\sum_{k=1}^{\infty}\mathbb{E} \Biggl(\frac{1}{N_{k}}\sum_{n=1}^{N_{k}} \eta_{n} \Biggr)^{2}< \infty, $$

which implies

$$ \lim_{k \to\infty}\frac{1}{N_{k}}\sum_{n=1}^{N_{k}}{ \eta}_{n}=0\quad \mbox{a.s.} $$
(3.1)

Note that for \(2^{k}< i\leq2^{k+1}\),

$$\begin{aligned}& \mathbb{E}f(Y_{i})I \biggl\{ f(Y_{i})\leq\frac{k}{(\log k)^{\beta }} \biggr\} \\& \quad = \int_{|x|\leq a_{k}}f(x)\,\mathrm{d}G_{i}(x) = \int_{|x|\leq a_{k}}f(x)\,\mathrm{d}\Phi \biggl(\frac{x}{\sigma_{i}} \biggr)+ \int _{|x|\leq a_{k}}f(x)\,\mathrm{d} \biggl(G_{i}(x)-\Phi \biggl(\frac{x}{\sigma_{i}} \biggr) \biggr). \end{aligned}$$
(3.2)

Set \(a=\int_{-\infty}^{\infty}f(x)\,\mathrm{d}\Phi(x)\). Noting that \(\sigma_{i}\leq1\), \(\lim_{i \to\infty}\sigma_{i}=1\), we have

$$ \lim_{k \to\infty}\sup_{2^{k}< i\leq2^{k+1}} \biggl\vert \int_{|x|\leq a_{k}}f(x)\,\mathrm{d}\Phi \biggl(\frac{x}{\sigma_{i}} \biggr)-a \biggr\vert =0. $$
(3.3)

Then by (3.2), (3.3), and (2.2) we get

$$\begin{aligned}& \biggl\vert \mathbb{E}f(Y_{i})I \biggl\{ f(Y_{i})\leq \frac{k}{(\log k)^{\beta}} \biggr\} -a \biggr\vert \\& \quad \leq \biggl\vert \int_{|x|\leq a_{k}}f(x)\,\mathrm{d}\Phi \biggl(\frac{x}{\sigma_{i}} \biggr)-a \biggr\vert + \biggl\vert \int _{|x|\leq a_{k}}f(x)\,\mathrm{d} \biggl(G_{i}(x)-\Phi \biggl(\frac{x}{\sigma _{i}} \biggr) \biggr) \biggr\vert \\& \quad \leq o_{k}(1)+\frac{k\theta_{i}}{(\log k)^{\beta}}. \end{aligned}$$

Thus

$$\mathbb{E}Z_{k}^{*}=a\sum_{i=2^{k}+1}^{2^{k+1}} \frac{1}{i}+\zeta_{k}\frac {k}{(\log k)^{\beta}}\sum _{i=2^{k}+1}^{2^{k+1}} \frac{\theta_{i}}{i}+o_{k}(1), \quad |\zeta_{k}|\leq1. $$

Using \(\sum_{i=1}^{L}1/i=\log L+O(1)\) and \(\sum_{i=1}^{\infty }\frac{\theta_{i}}{i}<\infty\), we get

$$\begin{aligned} \biggl\vert \frac{\mathbb{E}(\sum_{k=1}^{N}Z_{k}^{*})}{\log 2^{N+1}}-a \biggr\vert &\ll\frac{1}{N}\sum _{k=1}^{N}\frac{k}{(\log k)^{\beta}}\sum _{i=2^{k}+1}^{2^{k+1}} \frac{\theta_{i}}{i}+o_{N}(1) \\ &=O \bigl((\log N)^{-\beta} \bigr)+o_{N}(1) \\ &=o_{N}(1). \end{aligned}$$

Thus by (3.1), we get

$$\lim_{k \to\infty}\frac{\sum_{n=1}^{N_{k}}Z_{n}^{*}}{\log 2^{N_{k}+1}}=a\quad \mbox{a.s.} $$

Then by Lemma 2.3, we have

$$ \lim_{k \to\infty}\frac{\sum_{n=1}^{N_{k}}Z_{n}}{\log 2^{N_{k}+1}}=a\quad \mbox{a.s.} $$
(3.4)

The relation \(\lambda<1\) implies \(\lim_{k \to\infty}N_{k+1}/N_{k}=1\); thus (3.4) and the positivity of the \(Z_{k}\) yield

$$\lim_{N \to\infty}\frac{\sum_{k=1}^{N}Z_{k}}{\log 2^{N+1}}=a\quad \mbox{a.s.}, $$

that is, (1.6) holds along the subsequence \(N=2^{k}\). Using the positivity of the terms once more, we get (1.6). This completes the proof of Theorem 1.1.

References

  1. Rempala, G, Wesolowski, J: Asymptotics for products of sums and U-statistics. Electron. Commun. Probab. 7, 47-54 (2002)

  2. Gonchigdanzan, K, Rempala, G: A note on the almost sure limit theorem for the product of partial sums. Appl. Math. Lett. 19, 191-196 (2006)

  3. Tan, ZQ, Peng, ZX: Almost sure central limit theorem for the product of partial sums. Acta Math. Sci. 29, 1689-1698 (2009)

  4. Zhang, Y, Yang, XY, Dong, ZS: An almost sure central limit theorem for products of sums of partial sums under association. J. Math. Anal. Appl. 355, 708-716 (2009)

  5. Berkes, I, Csáki, E, Horváth, L: Almost sure central limit theorems under minimal conditions. Stat. Probab. Lett. 37, 67-76 (1998)

  6. Wu, QY: Almost sure central limit theory for products of sums of partial sums. Appl. Math. J. Chin. Univ. Ser. B 27, 169-180 (2012)

  7. Petrov, V: Sums of Independent Random Variables. Springer, New York (1975)

  8. Feller, W: The law of iterated logarithm for identically distributed random variables. Ann. Math. 47, 631-638 (1946)

  9. Friedman, N, Katz, M, Koopmans, LH: Convergence rates for the central limit theorem. Proc. Natl. Acad. Sci. USA 56, 1062-1065 (1966)

Acknowledgements

The authors would like to thank the editor and the referees for their valuable comments, which have improved the quality of the paper. This research is supported by the National Natural Science Foundation of China (71271042) and the Guangxi China Science Foundation (2013GXNSFAA278003). It is also supported by the Research Project of Guangxi High Institution (YB2014150).

Corresponding author

Correspondence to Fengxiang Feng.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

FF conceived of the study and drafted and completed the manuscript. DW participated in the discussion of the manuscript. FF and DW read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Cite this article

Feng, F., Wang, D. Almost sure central limit theorem for products of sums of partial sums. J Inequal Appl 2016, 49 (2016). https://doi.org/10.1186/s13660-016-0995-2
