# A note on the almost sure central limit theorem for the product of some partial sums

## Abstract

Let $\left({X}_{n}\right)$ be a sequence of i.i.d., positive, square integrable random variables with $E\left({X}_{1}\right)=\mu >0$, $Var\left({X}_{1}\right)={\sigma }^{2}$. Denote by ${S}_{n,k}={\sum }_{i=1}^{n}{X}_{i}-{X}_{k}$ and by $\gamma =\sigma /\mu$ the coefficient of variation. Our goal is to exhibit a class of unbounded, measurable functions g which satisfy the almost sure central limit theorem, i.e.,

$\underset{N\to \mathrm{\infty }}{lim}\frac{1}{logN}\sum _{n=1}^{N}\frac{1}{n}g\left({\left(\frac{{\prod }_{k=1}^{n}{S}_{n,k}}{{\left(n-1\right)}^{n}{\mu }^{n}}\right)}^{\frac{1}{\gamma \sqrt{n}}}\right)={\int }_{0}^{\mathrm{\infty }}g\left(x\right)\phantom{\rule{0.2em}{0ex}}dF\left(x\right)\phantom{\rule{1em}{0ex}}\text{a.s.},$

where $F\left(\cdot \right)$ is the distribution function of the random variable ${e}^{\mathcal{N}}$ and $\mathcal{N}$ is a standard normal random variable.

MSC: 60F15, 60F05.

## 1 Introduction

The almost sure central limit theorem (ASCLT) was first introduced independently by Schatte [1] and Brosamler [2]. Since then, many studies have proved the ASCLT in different situations, for example, in the case of the function-typed almost sure central limit theorem (FASCLT) (see Berkes et al. [3], Ibragimov and Lifshits [4]). The purpose of this paper is to investigate the FASCLT for the product of some partial sums.

Let $\left({X}_{n}\right)$ be a sequence of i.i.d. random variables and define the partial sum ${S}_{n}={\sum }_{k=1}^{n}{X}_{k}$ for $n\ge 1$. In a recent paper, Rempala and Wesolowski [5] showed, under the assumptions $E\left({X}^{2}\right)<\mathrm{\infty }$ and $X>0$, that

${\left(\frac{{\prod }_{k=1}^{n}{S}_{k}}{n!{\mu }^{n}}\right)}^{\frac{1}{\gamma \sqrt{n}}}\stackrel{d}{\to }{e}^{\sqrt{2}\mathcal{N}},$
(1)

where $\mathcal{N}$ is a standard normal random variable, $\mu =E\left(X\right)$ and $\gamma =\sigma /\mu$ with ${\sigma }^{2}=Var\left(X\right)$. For further results in this field, we refer to Qi [6], Lu and Qi [7] and Rempala and Wesolowski [8].
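As a numerical sanity check on (1) (not part of the original argument), one can simulate the logarithm of the normalized product for exponential variables, for which $\mu =\sigma =\gamma =1$; the quantity $\frac{1}{\gamma \sqrt{n}}{\sum }_{k=1}^{n}log\frac{{S}_{k}}{k\mu }$ should then be approximately $N\left(0,2\right)$-distributed. A minimal sketch in Python, with illustrative sample sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 1000, 4000

# X_i ~ Exp(1), so mu = sigma = 1 and gamma = sigma/mu = 1.
X = rng.exponential(1.0, size=(reps, n))
S = np.cumsum(X, axis=1)                  # partial sums S_1, ..., S_n
k = np.arange(1, n + 1)

# log of ((prod_k S_k) / (n! mu^n))^(1/(gamma sqrt(n)))
T = np.sum(np.log(S / k), axis=1) / np.sqrt(n)

# By (1), T is asymptotically sqrt(2) times a standard normal variable.
print(T.mean(), T.std())
```

For moderate n the sample standard deviation is already close to $\sqrt{2}\approx 1.414$, while the sample mean carries a small negative bias of order $logn/\sqrt{n}$ coming from the second-order term of the logarithm.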

Recently, Gonchigdanzan and Rempala [9] obtained the almost sure limit theorem related to (1) as follows.

Theorem A Let $\left({X}_{n}\right)$ be a sequence of i.i.d., positive random variables with $E\left({X}_{1}\right)=\mu >0$ and $Var\left({X}_{1}\right)={\sigma }^{2}$. Denote by $\gamma =\sigma /\mu$ the coefficient of variation. Then, for any real x,

$\underset{N\to \mathrm{\infty }}{lim}\frac{1}{logN}\sum _{n=1}^{N}\frac{1}{n}I\left({\left(\frac{{\prod }_{k=1}^{n}{S}_{k}}{n!{\mu }^{n}}\right)}^{\frac{1}{\gamma \sqrt{n}}}\le x\right)=G\left(x\right)\phantom{\rule{1em}{0ex}}\mathit{\text{a.s.}},$
(2)

where $G\left(x\right)$ is the distribution function of ${e}^{\sqrt{2}\mathcal{N}}$ and $\mathcal{N}$ is a standard normal random variable. Some extensions of the above result can be found in Ye and Wu [10] and the references therein.

A similar result on the product of partial sums was provided by Miao [11], which states the following.

Theorem B Let $\left({X}_{n}\right)$ be a sequence of i.i.d., positive, square integrable random variables with $E\left({X}_{1}\right)=\mu >0$ and $Var\left({X}_{1}\right)={\sigma }^{2}$. Denote by ${S}_{n,k}={\sum }_{i=1}^{n}{X}_{i}-{X}_{k}$ and $\gamma =\sigma /\mu$ the coefficient of variation. Then

${\left(\frac{{\prod }_{k=1}^{n}{S}_{n,k}}{{\left(n-1\right)}^{n}{\mu }^{n}}\right)}^{\frac{1}{\gamma \sqrt{n}}}\stackrel{d}{\to }{e}^{\mathcal{N}},$
(3)

and for any real x,

$\underset{N\to \mathrm{\infty }}{lim}\frac{1}{logN}\sum _{n=1}^{N}\frac{1}{n}I\left({\left(\frac{{\prod }_{k=1}^{n}{S}_{n,k}}{{\left(n-1\right)}^{n}{\mu }^{n}}\right)}^{\frac{1}{\gamma \sqrt{n}}}\le x\right)=F\left(x\right)\phantom{\rule{1em}{0ex}}\mathit{\text{a.s.}},$
(4)

where $F\left(\cdot \right)$ is the distribution function of the random variable ${e}^{\mathcal{N}}$ and $\mathcal{N}$ is a standard normal random variable.
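Convergence (3) also lends itself to a quick numerical check, again with exponential variables so that $\mu =\sigma =\gamma =1$. A sketch with illustrative sample sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 1000, 4000

# X_i ~ Exp(1): mu = sigma = 1, gamma = 1.
X = rng.exponential(1.0, size=(reps, n))
S_nk = X.sum(axis=1, keepdims=True) - X   # S_{n,k} = S_n - X_k

# log of the statistic in (3); by Theorem B it is approximately N(0, 1).
V = np.sum(np.log(S_nk / (n - 1)), axis=1) / np.sqrt(n)
print(V.mean(), V.std())
```

In contrast to (1), no factor $\sqrt{2}$ appears: heuristically, leaving one summand out of each ${S}_{n,k}$ makes the n logarithms almost perfectly correlated, and their sum behaves like a single normalized partial sum.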

The purpose of this paper is to investigate the validity of (4) for some class of unbounded measurable functions g.

Throughout this article, $\left({X}_{n}\right)$ is a sequence of i.i.d., positive, square integrable random variables with $E\left({X}_{1}\right)=\mu >0$ and $Var\left({X}_{1}\right)={\sigma }^{2}$. We denote by ${S}_{n,k}={\sum }_{i=1}^{n}{X}_{i}-{X}_{k}$ and by $\gamma =\sigma /\mu$ the coefficient of variation. Furthermore, $\mathcal{N}$ is a standard normal random variable, Φ is the standard normal distribution function, $\varphi$ is its density function, and ${a}_{n}\ll {b}_{n}$ stands for ${limsup}_{n\to \mathrm{\infty }}|{a}_{n}/{b}_{n}|<\mathrm{\infty }$.

## 2 Main result

We state our main result as follows.

Theorem 1 Let $g\left(x\right)$ be a real-valued, almost everywhere continuous function on R such that $|g\left({e}^{x}\right)\varphi \left(x\right)|\le c{\left(1+|x|\right)}^{-\alpha }$ with some $c>0$ and $\alpha >5$. Then

$\underset{N\to \mathrm{\infty }}{lim}\frac{1}{logN}\sum _{n=1}^{N}\frac{1}{n}g\left({\left(\frac{{\prod }_{k=1}^{n}{S}_{n,k}}{{\left(n-1\right)}^{n}{\mu }^{n}}\right)}^{\frac{1}{\gamma \sqrt{n}}}\right)={\int }_{0}^{\mathrm{\infty }}g\left(x\right)\phantom{\rule{0.2em}{0ex}}dF\left(x\right)\phantom{\rule{1em}{0ex}}\mathit{\text{a.s.}},$
(5)

where $F\left(\cdot \right)$ is the distribution function of the random variable ${e}^{\mathcal{N}}$.

Let $f\left(x\right)=g\left({e}^{x}\right)$. By a simple calculation, we can get the following result.

Remark 1 Let $f\left(x\right)$ be a real-valued, almost everywhere continuous function on R such that $|f\left(x\right)\varphi \left(x\right)|\le c{\left(1+|x|\right)}^{-\alpha }$ with some $c>0$ and $\alpha >5$. Then (5) is equivalent to

$\underset{N\to \mathrm{\infty }}{lim}\frac{1}{logN}\sum _{n=1}^{N}\frac{1}{n}f\left(\frac{1}{\gamma \sqrt{n}}\sum _{k=1}^{n}log\frac{{S}_{n,k}}{\left(n-1\right)\mu }\right)={\int }_{-\mathrm{\infty }}^{\mathrm{\infty }}f\left(x\right)\varphi \left(x\right)\phantom{\rule{0.2em}{0ex}}dx\phantom{\rule{1em}{0ex}}\text{a.s.}$
(6)

Remark 2 Lu et al. [12] proved the function-typed almost sure central limit theorem for a type of random function, which includes U-statistics, von Mises statistics, linear processes and some other statistics, but their results do not imply Theorem 1.

## 3 Auxiliary results

In this section, we state and prove several auxiliary results, which will be useful in the proof of Theorem 1.

Let ${\stackrel{˜}{S}}_{n}={\sum }_{i=1}^{n}\frac{{X}_{i}-\mu }{\sigma }$ and ${U}_{i}=\frac{1}{\gamma \sqrt{i}}{\sum }_{k=1}^{i}log\frac{{S}_{i,k}}{\left(i-1\right)\mu }$. Observe that for $|x|<1$ we have

$log\left(1+x\right)=x+\frac{\theta }{2}{x}^{2},$

where $\theta \in \left(-1,0\right)$. Thus

$\begin{array}{rcl}{U}_{i}& =& \frac{1}{\gamma \sqrt{i}}\sum _{k=1}^{i}log\frac{{S}_{i,k}}{\left(i-1\right)\mu }\\ =& \frac{1}{\gamma \sqrt{i}}\sum _{k=1}^{i}\left(\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1\right)+\frac{1}{\gamma \sqrt{i}}\sum _{k=1}^{i}\frac{{\theta }_{k}}{2}{\left(\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1\right)}^{2}\\ =& \frac{1}{\sqrt{i}}\sum _{k=1}^{i}\left(\frac{{\sum }_{j\ne k,j\le i}\left({X}_{j}-\mu \right)}{\left(i-1\right)\sigma }\right)+\frac{1}{\gamma \sqrt{i}}\sum _{k=1}^{i}\frac{{\theta }_{k}}{2}{\left(\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1\right)}^{2}\\ =& \frac{1}{\sqrt{i}}\sum _{k=1}^{i}\frac{{X}_{k}-\mu }{\sigma }+\frac{1}{\gamma \sqrt{i}}\sum _{k=1}^{i}\frac{{\theta }_{k}}{2}{\left(\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1\right)}^{2}\\ =:& \frac{1}{\sqrt{i}}{\stackrel{˜}{S}}_{i}+{R}_{i}.\end{array}$
(7)
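Decomposition (7) can be verified numerically for a single large i: the identity for the first-order term is exact, while the remainder ${R}_{i}$ is of smaller order. A sketch (exponential variables, illustrative size):

```python
import numpy as np

rng = np.random.default_rng(2)
i = 5000
mu = sigma = gamma = 1.0                  # X_j ~ Exp(1)

X = rng.exponential(1.0, size=i)
S_ik = X.sum() - X                        # S_{i,k} = sum over j != k of X_j
S_tilde = np.sum((X - mu) / sigma)

# The first-order term of (7) coincides exactly with S~_i / sqrt(i).
first = np.sum(S_ik / ((i - 1) * mu) - 1) / (gamma * np.sqrt(i))
assert np.isclose(first, S_tilde / np.sqrt(i))

# The remainder R_i = U_i - S~_i/sqrt(i) is small, in line with (8);
# it is negative since log(1 + x) < x for x != 0.
U = np.sum(np.log(S_ik / ((i - 1) * mu))) / (gamma * np.sqrt(i))
R = U - S_tilde / np.sqrt(i)
print(R)
```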

By the law of the iterated logarithm, we have, as $i\to \mathrm{\infty }$,

$\underset{1\le k\le i}{max}|\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1|=O\left({\left(loglogi/i\right)}^{1/2}\right)\phantom{\rule{1em}{0ex}}\text{a.s.}$

Therefore,

$|{R}_{i}|=|\frac{1}{\gamma \sqrt{i}}\sum _{k=1}^{i}\frac{{\theta }_{k}}{2}{\left(\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1\right)}^{2}|\ll \frac{1}{\sqrt{i}}\sum _{k=1}^{i}{\left(\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1\right)}^{2}\ll \frac{loglogi}{{i}^{1/2}}\phantom{\rule{1em}{0ex}}\text{a.s.}$
(8)

Obviously,

$\begin{array}{rl}E|{R}_{i}|& =E|\frac{1}{\gamma \sqrt{i}}\sum _{k=1}^{i}\frac{{\theta }_{k}}{2}{\left(\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1\right)}^{2}|\\ \ll \frac{1}{\sqrt{i}}\sum _{k=1}^{i}E{\left(\frac{{S}_{i,k}}{\left(i-1\right)\mu }-1\right)}^{2}\ll \frac{1}{\sqrt{i}}\sum _{k=1}^{i}\frac{1}{i-1}\ll \frac{1}{{i}^{1/2}}.\end{array}$
(9)

Our proof mainly relies on decomposition (7). Properties (8) and (9) will be extensively used in the following parts of this section.

Lemma 1 Let X and Y be random variables. We write $F\left(x\right)=P\left(X<x\right)$, $G\left(x\right)=P\left(X+Y<x\right)$. Then

$F\left(x-\epsilon \right)-P\left(|Y|\ge \epsilon \right)\le G\left(x\right)\le F\left(x+\epsilon \right)+P\left(|Y|\ge \epsilon \right)$

for every $\epsilon >0$ and x.

Proof It is Lemma 1.3 of Petrov [13]. □
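Lemma 1 is a standard smoothing inequality; its proof is a union bound, which also holds sample-by-sample for empirical distribution functions built from a coupled sample. A small numerical illustration (the choice of X, Y and ε is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, eps = 10_000, 0.05

X = rng.normal(size=n)
Y = 0.1 * rng.standard_t(df=3, size=n)    # an arbitrary perturbation

F = lambda t: np.mean(X <= t)             # empirical CDF of X
G = lambda t: np.mean(X + Y <= t)         # empirical CDF of X + Y
p = np.mean(np.abs(Y) >= eps)

# F(x - eps) - P(|Y| >= eps) <= G(x) <= F(x + eps) + P(|Y| >= eps)
for x in np.linspace(-3.0, 3.0, 61):
    assert F(x - eps) - p <= G(x) <= F(x + eps) + p
```

Here the inequalities hold deterministically for the empirical distributions, by the same argument as in the lemma: if ${X}_{i}+{Y}_{i}\le x$ and $|{Y}_{i}|<\epsilon$, then ${X}_{i}\le x+\epsilon$, and symmetrically for the lower bound.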

Lemma 2 Let $\left({X}_{n}\right)$ be a sequence of i.i.d. random variables. Let ${S}_{n}={\sum }_{k\le n}{X}_{k}$, ${F}^{s}$ denote the distribution function obtained from F by symmetrization, and choose $L>0$ so large that ${\int }_{|x|\le L}{x}^{2}\phantom{\rule{0.2em}{0ex}}d{F}^{s}\ge 1$. Then, for any $n\ge 1$, $\lambda >0$,

$\underset{a}{sup}P\left(a\le \frac{{S}_{n}}{\sqrt{n}}\le a+\lambda \right)\le A\lambda$

with some absolute constant A, provided $\lambda \sqrt{n}\ge L$.

Proof It can be obtained from Berkes et al. [3]. □

Lemma 3 Assume that (6) is true for all indicator functions of intervals and for a fixed a.e. continuous function $f\left(x\right)={f}_{0}\left(x\right)$. Then (6) is also true for all a.e. continuous functions f such that $|f\left(x\right)|\le |{f}_{0}\left(x\right)|$, $x\in R$, and, moreover, the exceptional set of probability 0 can be chosen universally for all such f.

Proof See Berkes et al. [3]. □

In view of Lemma 3 and Remark 1, in order to prove Theorem 1, it suffices to prove (6) for the case when $f\left(x\right)\varphi \left(x\right)={\left(1+|x|\right)}^{-\alpha }$, $\alpha >5$. Thus, in the following part, we put $f\left(x\right)\varphi \left(x\right)={\left(1+|x|\right)}^{-\alpha }$, $\alpha >5$ and

$\begin{array}{c}{\xi }_{k}=\sum _{i={2}^{k}+1}^{{2}^{k+1}}\frac{1}{i}f\left({U}_{i}\right),\hfill \\ {\xi }_{k}^{\ast }=\sum _{i={2}^{k}+1}^{{2}^{k+1}}\frac{1}{i}f\left({U}_{i}\right)I\left\{f\left({U}_{i}\right)\le \frac{k}{{\left(logk\right)}^{\beta }}\right\},\hfill \end{array}$

where $1<\beta <\frac{1}{2}\left(\alpha -3\right)$.

Lemma 4 Under the conditions of Theorem 1, with probability one, ${\xi }_{k}={\xi }_{k}^{\ast }$ holds for all but finitely many k.

Proof Let ${f}^{-1}$ denote the inverse function of f on the interval where f is strictly increasing, and let α, β satisfy $1<\beta <\frac{1}{2}\left(\alpha -3\right)$. It is easy to check that, for all sufficiently large k,

$\sqrt{2\pi }{\left(logk\right)}^{\alpha /2}\le {\left\{1+{\left(2logk+\left(\alpha -2\beta \right)loglogk\right)}^{1/2}\right\}}^{\alpha },$

and consequently

$\begin{array}{rcl}f\left({\left(2logk+\left(\alpha -2\beta \right)loglogk\right)}^{1/2}\right)& =& \frac{k}{{\left(logk\right)}^{\beta }}\frac{\sqrt{2\pi }{\left(logk\right)}^{\alpha /2}}{{\left\{1+{\left(2logk+\left(\alpha -2\beta \right)loglogk\right)}^{1/2}\right\}}^{\alpha }}\\ \le & \frac{k}{{\left(logk\right)}^{\beta }}.\end{array}$
(10)

Note that the function f is even and strictly increasing for $x\ge {x}_{0}$. We have

${f}^{-1}\left(k/{\left(logk\right)}^{\beta }\right)\ge {\left(2logk+\left(\alpha -2\beta \right)loglogk\right)}^{1/2}.$
(11)

Observing that ${2}^{k}<i\le {2}^{k+1}$ implies $k\ge \frac{1}{2}logi$, in view of (8) and (11) we get that, with probability one, the event $f\left({U}_{i}\right)>\frac{k}{{\left(logk\right)}^{\beta }}$ occurs for only finitely many i; here we use the assumption $\alpha -2\beta >3$ and a version of the Kolmogorov-Erdös-Feller-Petrovski test (see Feller [14], Theorem 2). This completes the proof of Lemma 4. □

Let ${a}_{k}={f}^{-1}\left(k/{\left(logk\right)}^{\beta }\right)$ and let ${G}_{i}$ and ${F}_{i}$ denote, respectively, the distribution function of ${U}_{i}$ and $\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{i}}$. Set

$\begin{array}{c}{\sigma }_{i}^{2}={\int }_{-\sqrt{i}}^{\sqrt{i}}{x}^{2}\phantom{\rule{0.2em}{0ex}}d{F}_{i}\left(x\right)-{\left({\int }_{-\sqrt{i}}^{\sqrt{i}}x\phantom{\rule{0.2em}{0ex}}d{F}_{i}\left(x\right)\right)}^{2},\hfill \\ {\eta }_{i}=\underset{x}{sup}|{G}_{i}\left(x\right)-\mathrm{\Phi }\left(\frac{x}{{\sigma }_{i}}\right)|,\hfill \\ {\epsilon }_{i}=\underset{x}{sup}|{F}_{i}\left(x\right)-\mathrm{\Phi }\left(\frac{x}{{\sigma }_{i}}\right)|.\hfill \end{array}$

Clearly, ${\sigma }_{i}\le 1$, ${lim}_{i\to \mathrm{\infty }}{\sigma }_{i}=1$.

Lemma 5 Under the conditions of Theorem  1, we have

$\sum _{k\le N}E{\left({\xi }_{k}^{\ast }\right)}^{2}\ll \frac{{N}^{2}}{{\left(logN\right)}^{2\beta }}.$

Proof Observe now that the relation

$|{\int }_{-a}^{a}\psi \left(x\right)\phantom{\rule{0.2em}{0ex}}d\left({G}_{1}\left(x\right)-{G}_{2}\left(x\right)\right)|\le \underset{-a\le x\le a}{sup}|\psi \left(x\right)|\cdot \underset{-a\le x\le a}{sup}|{G}_{1}\left(x\right)-{G}_{2}\left(x\right)|$
(12)

is valid for any bounded, measurable functions ψ and distribution functions ${G}_{1}$, ${G}_{2}$. Let, as previously, ${a}_{k}={f}^{-1}\left(k/{\left(logk\right)}^{\beta }\right)$. Thus, for any ${2}^{k}<i\le {2}^{k+1}$, we obtain that

$\begin{array}{rcl}E{f}^{2}\left({U}_{i}\right)I\left\{f\left({U}_{i}\right)\le \frac{k}{{\left(logk\right)}^{\beta }}\right\}& =& {\int }_{|x|\le {a}_{k}}{f}^{2}\left(x\right)\phantom{\rule{0.2em}{0ex}}d{G}_{i}\left(x\right)\\ \le & {\int }_{|x|\le {a}_{k}}{f}^{2}\left(x\right)\phantom{\rule{0.2em}{0ex}}d\mathrm{\Phi }\left(\frac{x}{{\sigma }_{i}}\right)+{\eta }_{i}\frac{{k}^{2}}{{\left(logk\right)}^{2\beta }}\\ \ll & {\int }_{|x|\le {a}_{k}}{f}^{2}\left(x\right)\phantom{\rule{0.2em}{0ex}}d\mathrm{\Phi }\left(x\right)+{\eta }_{i}\frac{{k}^{2}}{{\left(logk\right)}^{2\beta }},\end{array}$

where in the last step, we have used the fact that ${\sigma }_{i}\le 1$, ${lim}_{i\to \mathrm{\infty }}{\sigma }_{i}=1$. Hence, by the Cauchy-Schwarz inequality, we have

$\begin{array}{rcl}E{\left({\xi }_{k}^{\ast }\right)}^{2}& \ll & E{\left[{\left(\sum _{i={2}^{k}+1}^{{2}^{k+1}}{\left(\frac{1}{i}\right)}^{2}\right)}^{1/2}{\left(\sum _{i={2}^{k}+1}^{{2}^{k+1}}{f}^{2}\left({U}_{i}\right)I\left\{f\left({U}_{i}\right)\le \frac{k}{{\left(logk\right)}^{\beta }}\right\}\right)}^{1/2}\right]}^{2}\\ \ll & \left(\sum _{i={2}^{k}+1}^{{2}^{k+1}}\frac{1}{{i}^{2}}\right)\left(\sum _{i={2}^{k}+1}^{{2}^{k+1}}\left({\int }_{|x|\le {a}_{k}}{f}^{2}\left(x\right)\phantom{\rule{0.2em}{0ex}}d\mathrm{\Phi }\left(x\right)+{\eta }_{i}\frac{{k}^{2}}{{\left(logk\right)}^{2\beta }}\right)\right)\\ \ll & \frac{1}{{2}^{k}}\left({2}^{k}{\int }_{|x|\le {a}_{k}}{f}^{2}\left(x\right)\phantom{\rule{0.2em}{0ex}}d\mathrm{\Phi }\left(x\right)+\frac{{k}^{2}}{{\left(logk\right)}^{2\beta }}\sum _{i={2}^{k}+1}^{{2}^{k+1}}{\eta }_{i}\right)\\ \ll & {\int }_{|x|\le {a}_{k}}\frac{{e}^{{x}^{2}/2}}{{\left(1+|x|\right)}^{2\alpha }}\phantom{\rule{0.2em}{0ex}}dx+\frac{{k}^{2}}{{\left(logk\right)}^{2\beta }}\sum _{i={2}^{k}+1}^{{2}^{k+1}}\frac{{\eta }_{i}}{i}.\end{array}$
(13)

Note that

${\int }_{0}^{t}\frac{{e}^{{x}^{2}/2}}{{\left(1+|x|\right)}^{2\alpha }}\phantom{\rule{0.2em}{0ex}}dx={\int }_{0}^{t/2}+{\int }_{t/2}^{t}\ll t{e}^{{t}^{2}/8}+\frac{1}{{t}^{2\alpha +1}}{\int }_{t/2}^{t}x{e}^{{x}^{2}/2}\phantom{\rule{0.2em}{0ex}}dx\ll \frac{{e}^{{t}^{2}/2}}{{t}^{2\alpha +1}},$

and thus by (10) and (11), we have

${\int }_{|x|\le {a}_{k}}\frac{{e}^{{x}^{2}/2}}{{\left(1+|x|\right)}^{2\alpha }}\phantom{\rule{0.2em}{0ex}}dx\ll \frac{{e}^{{a}_{k}^{2}/2}}{{a}_{k}^{2\alpha +1}}\ll f\left({a}_{k}\right)\frac{1}{{a}_{k}^{\alpha +1}}\ll \frac{k}{{\left(logk\right)}^{\beta +\left(\alpha +1\right)/2}}.$
(14)
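The tail estimate used above, ${\int }_{0}^{t}{e}^{{x}^{2}/2}{\left(1+x\right)}^{-2\alpha }\phantom{\rule{0.2em}{0ex}}dx\ll {e}^{{t}^{2}/2}/{t}^{2\alpha +1}$, can be checked numerically for large t; by Laplace's method the ratio of the two sides tends to 1. A sketch (α = 5.5 is an arbitrary admissible value, and the trapezoidal rule is coded directly for portability):

```python
import numpy as np

alpha = 5.5                               # any alpha > 5 works here

def lhs(t, m=400_000):
    """Trapezoidal approximation of the integral on [0, t]."""
    x = np.linspace(0.0, t, m)
    y = np.exp(x**2 / 2) / (1 + x)**(2 * alpha)
    h = x[1] - x[0]
    return h * (y.sum() - 0.5 * (y[0] + y[-1]))

ratios = [lhs(t) / (np.exp(t**2 / 2) / t**(2 * alpha + 1))
          for t in (8.0, 10.0, 12.0)]
print(ratios)                             # bounded, increasing toward 1
```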

Now we estimate ${\eta }_{i}$. By Lemma 1, we have that for any $\epsilon >0$,

$\begin{array}{rcl}{\eta }_{i}& =& \underset{x}{sup}|{G}_{i}\left(x\right)-\mathrm{\Phi }\left(\frac{x}{{\sigma }_{i}}\right)|\\ \le & \underset{x}{sup}|{G}_{i}\left(x\right)-{F}_{i}\left(x\right)|+\underset{x}{sup}|{F}_{i}\left(x\right)-\mathrm{\Phi }\left(\frac{x}{{\sigma }_{i}}\right)|\\ =& \underset{x}{sup}|P\left({U}_{i}\le x\right)-P\left(\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{i}}\le x\right)|+{\epsilon }_{i}\\ =& \underset{x}{sup}|P\left(\left(\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{i}}+{R}_{i}\right)\le x\right)-P\left(\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{i}}\le x\right)|+{\epsilon }_{i}\\ \le & P\left(|{R}_{i}|\ge \epsilon \right)+\underset{x}{sup}\left\{P\left(\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{i}}\le x+\epsilon \right)-P\left(\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{i}}\le x\right)\right\}+{\epsilon }_{i}.\end{array}$

The Markov inequality and (9) imply that

$P\left(|{R}_{i}|\ge \epsilon \right)\le \frac{E|{R}_{i}|}{\epsilon }\ll \frac{1}{{i}^{1/2}\epsilon }.$
Moreover, by Lemma 2, for all sufficiently large i,

$\underset{x}{sup}\left\{P\left(\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{i}}\le x+\epsilon \right)-P\left(\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{i}}\le x\right)\right\}\ll \epsilon .$

Setting $\epsilon ={i}^{-1/3}$, we have

${\eta }_{i}\ll \frac{1}{{i}^{1/6}}+\frac{1}{{i}^{1/3}}+{\epsilon }_{i}.$

Using Theorem 1 of Friedman et al. [15], we get

$\sum _{i=1}^{\mathrm{\infty }}\frac{{\epsilon }_{i}}{i}<\mathrm{\infty }.$

Hence,

$\sum _{i=1}^{\mathrm{\infty }}\frac{{\eta }_{i}}{i}\ll \sum _{i=1}^{\mathrm{\infty }}\frac{\frac{1}{{i}^{1/6}}+{\epsilon }_{i}}{i}<\mathrm{\infty },$
(15)

which, coupled with (13), (14) and the fact $\frac{1}{2}\left(\alpha +1\right)>\beta$, yields

$\begin{array}{rcl}\sum _{k\le N}E{\left({\xi }_{k}^{\ast }\right)}^{2}& \ll & \sum _{k\le N}\frac{k}{{\left(logk\right)}^{\beta +\left(\alpha +1\right)/2}}+\sum _{k\le N}\frac{{k}^{2}}{{\left(logk\right)}^{2\beta }}\sum _{i={2}^{k}+1}^{{2}^{k+1}}\frac{{\eta }_{i}}{i}\\ \ll & \frac{{N}^{2}}{{\left(logN\right)}^{2\beta }},\end{array}$

which completes the proof. □

Lemma 6 Let ${\xi }_{k}^{\ast }={\sum }_{i={2}^{k}+1}^{{2}^{k+1}}\frac{1}{i}f\left({U}_{i}\right)I\left\{f\left({U}_{i}\right)\le \frac{k}{{\left(logk\right)}^{\beta }}\right\}$, ${\xi }_{l}^{\ast }={\sum }_{i={2}^{l}+1}^{{2}^{l+1}}\frac{1}{i}f\left({U}_{i}\right)I\left\{f\left({U}_{i}\right)\le \frac{l}{{\left(logl\right)}^{\beta }}\right\}$. Under the conditions of Theorem  1, we have for $l\ge {l}_{0}$

$|cov\left({\xi }_{k}^{\ast },{\xi }_{l}^{\ast }\right)|\ll \frac{kl}{{\left(logk\right)}^{\beta }{\left(logl\right)}^{\beta }}{2}^{-\left(l-k-1\right)/4}.$

Proof We first show that, for any $1\le i\le \frac{j}{2}$ and all real x, y,

$|P\left({U}_{i}\le x,{U}_{j}\le y\right)-P\left({U}_{i}\le x\right)P\left({U}_{j}\le y\right)|\ll {\left(\frac{i}{j}\right)}^{1/4}.$
(16)

Letting $\rho =\frac{i}{j}$, the Chebyshev inequality yields

$P\left(|\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{j}}|\ge {\rho }^{1/4}\right)\le \frac{1}{j}{\rho }^{-1/2}E{|{\stackrel{˜}{S}}_{i}|}^{2}={\rho }^{1/2}.$
(17)

Using the Markov inequality and (9), we have

$P\left(|{R}_{j}|\ge {\rho }^{1/4}\right)\le \frac{E|{R}_{j}|}{{\rho }^{1/4}}\ll \frac{1}{{j}^{1/2}{\rho }^{1/4}}=\frac{1}{{j}^{1/4}{i}^{1/4}}\le {\rho }^{1/4}.$
(18)

It follows from Lemma 1, Lemma 2, (17), (18) and the positivity and independence of $\left({X}_{n}\right)$ that

$\begin{array}{r}P\left({U}_{i}\le x,{U}_{j}\le y\right)\\ \phantom{\rule{1em}{0ex}}=P\left({U}_{i}\le x,\frac{{\stackrel{˜}{S}}_{j}}{\sqrt{j}}+{R}_{j}\le y\right)\\ \phantom{\rule{1em}{0ex}}=P\left({U}_{i}\le x,\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{j}}+\sqrt{1-\rho }\frac{{\stackrel{˜}{S}}_{j}-{\stackrel{˜}{S}}_{i}}{\sqrt{j-i}}+{R}_{j}\le y\right)\\ \phantom{\rule{1em}{0ex}}\ge P\left({U}_{i}\le x,\sqrt{1-\rho }\frac{{\stackrel{˜}{S}}_{j}-{\stackrel{˜}{S}}_{i}}{\sqrt{j-i}}\le y\right)\\ \phantom{\rule{2em}{0ex}}-P\left(y-2{\rho }^{1/4}\le \sqrt{1-\rho }\frac{{\stackrel{˜}{S}}_{j}-{\stackrel{˜}{S}}_{i}}{\sqrt{j-i}}\le y\right)-P\left(|\frac{{\stackrel{˜}{S}}_{i}}{\sqrt{j}}|\ge {\rho }^{1/4}\right)-P\left(|{R}_{j}|\ge {\rho }^{1/4}\right)\\ \phantom{\rule{1em}{0ex}}\ge P\left({U}_{i}\le x,\sqrt{1-\rho }\frac{{\stackrel{˜}{S}}_{j}-{\stackrel{˜}{S}}_{i}}{\sqrt{j-i}}\le y\right)-\left(4A+O\left(1\right)+1\right){\rho }^{1/4}\\ \phantom{\rule{1em}{0ex}}=P\left({U}_{i}\le x\right)P\left(\sqrt{1-\rho }\frac{{\stackrel{˜}{S}}_{j}-{\stackrel{˜}{S}}_{i}}{\sqrt{j-i}}\le y\right)-\left(4A+O\left(1\right)+1\right){\rho }^{1/4}.\end{array}$
(19)

We can obtain an analogous upper estimate for the first probability in (19) by the same way. Thus

$P\left({U}_{i}\le x,{U}_{j}\le y\right)=P\left({U}_{i}\le x\right)P\left(\sqrt{1-\rho }\frac{{\stackrel{˜}{S}}_{j}-{\stackrel{˜}{S}}_{i}}{\sqrt{j-i}}\le y\right)-\theta \left(4A+O\left(1\right)+1\right){\rho }^{1/4},$

where $|\theta |\le 1$. A similar argument yields

$P\left({U}_{i}\le x\right)P\left({U}_{j}\le y\right)=P\left({U}_{i}\le x\right)P\left(\sqrt{1-\rho }\frac{{\stackrel{˜}{S}}_{j}-{\stackrel{˜}{S}}_{i}}{\sqrt{j-i}}\le y\right)-{\theta }^{\prime }\left(4A+O\left(1\right)+1\right){\rho }^{1/4},$

where $|{\theta }^{\prime }|\le 1$, and (16) follows. Letting ${G}_{i,j}\left(x,y\right)$ denote the joint distribution function of ${U}_{i}$ and ${U}_{j}$, in view of (12), (16), we get for $l\ge {l}_{0}$

$\begin{array}{r}|cov\left(f\left({U}_{i}\right)I\left\{f\left({U}_{i}\right)\le \frac{k}{{\left(logk\right)}^{\beta }}\right\},f\left({U}_{j}\right)I\left\{f\left({U}_{j}\right)\le \frac{l}{{\left(logl\right)}^{\beta }}\right\}\right)|\\ \phantom{\rule{1em}{0ex}}=|{\int }_{|x|\le {a}_{k}}{\int }_{|y|\le {a}_{l}}f\left(x\right)f\left(y\right)\phantom{\rule{0.2em}{0ex}}d\left({G}_{i,j}\left(x,y\right)-{G}_{i}\left(x\right){G}_{j}\left(y\right)\right)|\\ \phantom{\rule{1em}{0ex}}\ll \frac{kl}{{\left(logk\right)}^{\beta }{\left(logl\right)}^{\beta }}{2}^{-\left(l-k-1\right)/4},\end{array}$

where the last relation follows from the facts that f is strictly increasing for $x\ge {x}_{0}$, $f\left({a}_{i}\right)=\frac{i}{{\left(logi\right)}^{\beta }}$, ${2}^{k}<i\le {2}^{k+1}$ and ${2}^{l}<j\le {2}^{l+1}$. Thus

$|cov\left({\xi }_{k}^{\ast },{\xi }_{l}^{\ast }\right)|\ll \frac{kl}{{\left(logk\right)}^{\beta }{\left(logl\right)}^{\beta }}{2}^{-\left(l-k-1\right)/4}.$

□

Lemma 7 Under the conditions of Theorem  1, letting ${\zeta }_{k}={\xi }_{k}^{\ast }-E{\xi }_{k}^{\ast }$, we have

$E{\left({\zeta }_{1}+\cdots +{\zeta }_{N}\right)}^{2}=O\left(\frac{{N}^{2}}{{\left(logN\right)}^{2\beta -1}}\right),\phantom{\rule{1em}{0ex}}N\to \mathrm{\infty }.$

Proof By Lemma 6, we have

$|\underset{l-k>40logN}{\sum _{1\le k\le l\le N}}E\left({\zeta }_{k}{\zeta }_{l}\right)|\ll \frac{{N}^{2}}{{\left(logN\right)}^{2\beta }}{N}^{2}{2}^{-10logN}=o\left(1\right).$

On the other hand, letting $\parallel \cdot \parallel$ denote the ${L}_{2}$ norm, Lemma 5 and the Cauchy-Schwarz inequality imply

$\begin{array}{rcl}|\underset{l-k\le 40logN}{\sum _{1\le k\le l\le N}}E\left({\zeta }_{k}{\zeta }_{l}\right)|& \le & \underset{l-k\le 40logN}{\sum _{1\le k\le l\le N}}\parallel {\zeta }_{k}\parallel \parallel {\zeta }_{l}\parallel \\ \le & \underset{l-k\le 40logN}{\sum _{1\le k\le l\le N}}\parallel {\xi }_{k}^{\ast }\parallel \parallel {\xi }_{l}^{\ast }\parallel \\ =& \sum _{0\le j\le 40logN}\sum _{k=1}^{N-j}\parallel {\xi }_{k}^{\ast }\parallel \parallel {\xi }_{k+j}^{\ast }\parallel \\ \le & {\left(\sum _{k=1}^{N}{\parallel {\xi }_{k}^{\ast }\parallel }^{2}\right)}^{1/2}{\left(\sum _{l=1}^{N}{\parallel {\xi }_{l}^{\ast }\parallel }^{2}\right)}^{1/2}40logN\\ =& O\left(\frac{{N}^{2}}{{\left(logN\right)}^{2\beta -1}}\right),\end{array}$

and Lemma 7 is proved. □

## 4 Proof of the main result

We only prove the property in (6), since, in view of Remark 1, it is sufficient for the proof of Theorem 1.

Proof of Theorem 1 By Lemma 7 we have

$E{\left(\frac{{\zeta }_{1}+\cdots +{\zeta }_{N}}{N}\right)}^{2}=O\left({\left(logN\right)}^{1-2\beta }\right),$

and thus setting ${N}_{k}=\left[exp\left({k}^{\lambda }\right)\right]$ with ${\left(2\beta -1\right)}^{-1}<\lambda <1$, we get

$\sum _{k=1}^{\mathrm{\infty }}E{\left(\frac{{\zeta }_{1}+\cdots +{\zeta }_{{N}_{k}}}{{N}_{k}}\right)}^{2}<\mathrm{\infty },$

and therefore

$\underset{k\to \mathrm{\infty }}{lim}\frac{{\zeta }_{1}+\cdots +{\zeta }_{{N}_{k}}}{{N}_{k}}=0\phantom{\rule{1em}{0ex}}\text{a.s.}$
(20)

Observe now that for ${2}^{k}<i\le {2}^{k+1}$ we have

$\begin{array}{rcl}Ef\left({U}_{i}\right)I\left\{f\left({U}_{i}\right)\le \frac{k}{{\left(logk\right)}^{\beta }}\right\}& =& {\int }_{|x|\le {a}_{k}}f\left(x\right)\phantom{\rule{0.2em}{0ex}}d{G}_{i}\left(x\right)\\ =& {\int }_{|x|\le {a}_{k}}f\left(x\right)\phantom{\rule{0.2em}{0ex}}d\mathrm{\Phi }\left(\frac{x}{{\sigma }_{i}}\right)+{\int }_{|x|\le {a}_{k}}f\left(x\right)\phantom{\rule{0.2em}{0ex}}d\left({G}_{i}\left(x\right)-\mathrm{\Phi }\left(\frac{x}{{\sigma }_{i}}\right)\right).\end{array}$

Put $m={\int }_{-\mathrm{\infty }}^{\mathrm{\infty }}f\left(x\right)\phantom{\rule{0.2em}{0ex}}d\mathrm{\Phi }\left(x\right)$. Since ${\sigma }_{i}\le 1$, ${lim}_{i\to \mathrm{\infty }}{\sigma }_{i}=1$ and ${a}_{k}\to \mathrm{\infty }$ as $k\to \mathrm{\infty }$, we have

$\underset{k\to \mathrm{\infty }}{lim}\underset{{2}^{k}<i\le {2}^{k+1}}{max}|{\int }_{|x|\le {a}_{k}}f\left(x\right)\phantom{\rule{0.2em}{0ex}}d\mathrm{\Phi }\left(\frac{x}{{\sigma }_{i}}\right)-m|=0,$

and thus, using (12), we get

$|Ef\left({U}_{i}\right)I\left\{f\left({U}_{i}\right)\le \frac{k}{{\left(logk\right)}^{\beta }}\right\}-m|\le \frac{k{\eta }_{i}}{{\left(logk\right)}^{\beta }}+{o}_{k}\left(1\right).$

Thus we have

$E{\xi }_{k}^{\ast }=m\sum _{i={2}^{k}+1}^{{2}^{k+1}}\frac{1}{i}+{\vartheta }_{k}\frac{k}{{\left(logk\right)}^{\beta }}\sum _{i={2}^{k}+1}^{{2}^{k+1}}\frac{{\eta }_{i}}{i}+{o}_{k}\left(1\right),\phantom{\rule{1em}{0ex}}|{\vartheta }_{k}|\le 1.$

Consequently, using the relation ${\sum }_{i\le L}1/i=logL+O\left(1\right)$ and (15), we conclude

$\begin{array}{rcl}|\frac{E\left({\xi }_{1}^{\ast }+\cdots +{\xi }_{N}^{\ast }\right)}{log{2}^{N+1}}-m|& \ll & \frac{1}{N}\sum _{k\le N}\frac{k}{{\left(logk\right)}^{\beta }}\sum _{i={2}^{k}+1}^{{2}^{k+1}}\frac{{\eta }_{i}}{i}+{o}_{N}\left(1\right)\\ =& O\left({\left(logN\right)}^{-\beta }\right)+{o}_{N}\left(1\right)={o}_{N}\left(1\right),\end{array}$

and thus (20) gives

$\underset{k\to \mathrm{\infty }}{lim}\frac{{\xi }_{1}^{\ast }+\cdots +{\xi }_{{N}_{k}}^{\ast }}{log{2}^{{N}_{k}+1}}=m\phantom{\rule{1em}{0ex}}\text{a.s.}$

By Lemma 4 this implies

$\underset{k\to \mathrm{\infty }}{lim}\frac{{\xi }_{1}+\cdots +{\xi }_{{N}_{k}}}{log{2}^{{N}_{k}+1}}=m\phantom{\rule{1em}{0ex}}\text{a.s.}$
(21)

The relation $\lambda <1$ implies ${lim}_{k\to \mathrm{\infty }}{N}_{k+1}/{N}_{k}=1$, and thus (21) and the positivity of ${\xi }_{k}$ yield

$\underset{N\to \mathrm{\infty }}{lim}\frac{{\xi }_{1}+\cdots +{\xi }_{N}}{log{2}^{N+1}}=m\phantom{\rule{1em}{0ex}}\text{a.s.},$
(22)

i.e., (6) holds for the subsequence $\left\{{2}^{N+1}\right\}$. Now, for each $N\ge 4$, there exists n, depending on N, such that ${2}^{n+1}\le N\le {2}^{n+2}$. Then

$\frac{{\xi }_{1}+{\xi }_{2}+\cdots +{\xi }_{n}}{log{2}^{n+1}}\le \frac{{\sum }_{i=1}^{N}\frac{1}{i}f\left({U}_{i}\right)}{logN}\frac{logN}{log{2}^{n+1}}\le \frac{{\xi }_{1}+{\xi }_{2}+\cdots +{\xi }_{n+2}}{log{2}^{n+2}}\frac{log{2}^{n+2}}{log{2}^{n+1}}$
(23)

by the positivity of each term of $\left({\xi }_{k}\right)$. Noting that $\left(n+1\right)log2\sim logN\sim \left(n+2\right)log2$ as $N\to \mathrm{\infty }$, we get (6) by (22) and (23). □

## References

1. Schatte P: On strong versions of the central limit theorem. Math. Nachr. 1988, 137: 249–256. 10.1002/mana.19881370117

2. Brosamler GA: An almost everywhere central limit theorem. Math. Proc. Camb. Philos. Soc. 1988, 104: 561–574. 10.1017/S0305004100065750

3. Berkes I, Csáki E, Horváth L: Almost sure limit theorems under minimal conditions. Stat. Probab. Lett. 1998, 37: 67–76. 10.1016/S0167-7152(97)00101-6

4. Ibragimov I, Lifshits M: On the convergence of generalized moments in almost sure central limit theorem. Stat. Probab. Lett. 1998, 40: 343–351. 10.1016/S0167-7152(98)00134-5

5. Rempala G, Wesolowski J: Asymptotics for products of sums and U-statistics. Electron. Commun. Probab. 2002, 7: 47–54.

6. Qi Y: Limit distributions for products of sums. Stat. Probab. Lett. 2003, 62: 93–100. 10.1016/S0167-7152(02)00438-8

7. Lu X, Qi Y: A note on asymptotic distribution of products of sums. Stat. Probab. Lett. 2004, 68: 407–413. 10.1016/j.spl.2004.04.009

8. Rempala G, Wesolowski J: Asymptotics for products of independent sums with an application to Wishart determinants. Stat. Probab. Lett. 2005, 74: 129–138. 10.1016/j.spl.2005.04.034

9. Gonchigdanzan K, Rempala G: A note on the almost sure limit theorem for the product of partial sums. Appl. Math. Lett. 2006, 19: 191–196. 10.1016/j.aml.2005.06.002

10. Ye D, Wu Q: Almost sure central limit theorem of product of partial sums for strongly mixing. J. Inequal. Appl. 2011., 2011: Article ID 576301

11. Miao Y: Central limit theorem and almost sure central limit theorem for the product of some partial sums. Proc. Indian Acad. Sci. Math. Sci. 2008, 118: 289–294. 10.1007/s12044-008-0021-9

12. Lu C, Qiu J, Xu J: Almost sure central limit theorems for random functions. Sci. China Ser. A 2006, 49: 1788–1799. 10.1007/s11425-006-2021-5

13. Petrov V: Sums of Independent Random Variables. Springer, New York; 1975.

14. Feller W: The law of iterated logarithm for identically distributed random variables. Ann. Math. 1946, 47: 631–638. 10.2307/1969225

15. Friedman N, Katz M, Koopmans LH: Convergence rates for the central limit theorem. Proc. Natl. Acad. Sci. USA 1966, 56: 1062–1065. 10.1073/pnas.56.4.1062

## Acknowledgements

The authors wish to thank the editor and the referees for their very valuable comments by which the quality of the paper has been improved. The authors would also like to thank Professor Zuoxiang Peng for several discussions and suggestions. Research supported by the National Science Foundation of China (No. 11326175), the Natural Science Foundation of Zhejiang Province of China (No. LQ14A010012) and the Research Start-up Foundation of Jiaxing University (No. 70512021).

## Author information

### Corresponding author

Correspondence to Zhongquan Tan.

### Competing interests

The authors declare that they have no competing interests.

### Authors’ contributions

All authors read and approved the final manuscript.
