# An almost sure central limit theorem of products of partial sums for ρ--mixing sequences

## Abstract

Let {X n , n ≥ 1} be a strictly stationary ρ--mixing sequence of positive random variables with EX1 = μ > 0 and Var(X1) = σ2 < ∞. Denote ${S}_{n}=\sum _{i=1}^{n}{X}_{i}$ and let $\gamma =\frac{\sigma }{\mu }$ be the coefficient of variation. Under suitable conditions, using a central limit theorem for weighted sums and a moment inequality, we show that

$\forall x,\phantom{\rule{1em}{0ex}}\underset{n\to \infty }{\text{lim}}\frac{1}{\text{log}n}\sum _{k=1}^{n}\frac{1}{k}I\left\{{\left(\prod _{i=1}^{k}\frac{{S}_{i}}{i\mu }\right)}^{\frac{1}{\gamma {\sigma }_{k}}}\le x\right\}=F\left(x\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.,$

where ${\sigma }_{k}^{2}=Var\left({S}_{k,k}\right),\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}{S}_{k,k}=\sum _{i=1}^{k}{b}_{i,k}{Y}_{i},\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}{b}_{i,k}=\sum _{j=i}^{k}\frac{1}{j},\phantom{\rule{2.77695pt}{0ex}}i\le k$ with ${b}_{i,k}=0$ for $i>k$, ${Y}_{i}=\frac{{X}_{i}-\mu }{\sigma }$, $F\left(x\right)$ is the distribution function of the random variable ${e}^{\sqrt{2}\mathcal{N}}$, and $\mathcal{N}$ is a standard normal random variable.

MR(2000) Subject Classification: 60F15.

## 1 Introduction and main results

For a random variable X, define ${∥X∥}_{p}={\left(E{\left|X\right|}^{p}\right)}^{\frac{1}{p}}$. For two nonempty disjoint sets $S,T\subset N$, we define dist(S,T) to be min{|j - k|; j ∈ S, k ∈ T}. Let σ(S) be the σ-field generated by {X k , k ∈ S}, and define σ(T) similarly. Let $C$ be the class of functions which are coordinatewise increasing. For any real number x, x+ and x- denote its positive and negative parts, respectively (except in some special notations, for example, ρ-(s), ρ-(S,T), etc.). For random variables X, Y, define

${\rho }^{-}\left(X,Y\right)=0\vee \text{sup}\frac{Cov\left(f\left(X\right),g\left(Y\right)\right)}{{\left(Varf\left(X\right)\right)}^{\frac{1}{2}}{\left(Varg\left(Y\right)\right)}^{\frac{1}{2}}},$

where the sup is taken over all $f,g\in C$ such that E(f(X))2 < ∞ and E(g(Y))2 < ∞.

A sequence {X n , n ≥ 1} is called negatively associated (NA) if for every pair of disjoint subsets S, T of N,

$Cov\left\{f\left({X}_{i},i\in S\right),g\left({X}_{j},j\in T\right)\right\}\le 0,$

whenever $f,g\in C$.

A sequence {X n , n ≥ 1} is called ρ*-mixing if

${\rho }^{*}\left(s\right)=\text{sup}\left\{\rho \left(S,T\right);S,T\subset N,\text{dist}\left(S,T\right)\ge s\right\}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}s\to \infty ,$

where

$\rho \left(S,T\right)=\text{sup}\left\{\left|E\left(f-Ef\right)\left(g-Eg\right)/\left({∥f-Ef∥}_{2}\cdot {∥g-Eg∥}_{2}\right)\right|;f\in {L}_{2}\left(\sigma \left(S\right)\right),g\in {L}_{2}\left(\sigma \left(T\right)\right)\right\}.$

A sequence {X n , n ≥ 1} is called ρ--mixing if

${\rho }^{-}\left(s\right)=\text{sup}\left\{{\rho }^{-}\left(S,T\right);S,T\subset N,\text{dist}\left(S,T\right)\ge s\right\}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{0.3em}{0ex}}s\to \infty ,$

where

${\rho }^{-}\left(S,T\right)=0\vee \text{sup}\left\{\frac{Cov\left\{f\left({X}_{i},i\in S\right),g\left({X}_{j},j\in T\right)\right\}}{\sqrt{Var\left\{f\left({X}_{i},i\in S\right)\right\}Var\left\{g\left({X}_{j},j\in T\right)\right\}}};f,g\in C\right\}.$

The concept of ρ--mixing random variables was proposed in 1999 (see [1]). Clearly, the class of ρ--mixing sequences contains both NA and ρ*-mixing sequences, which have many applications. Their limit properties have attracted wide interest recently, and many results have been obtained, such as weak convergence theorems, central limit theorems for random fields, and Rosenthal-type moment inequalities; see [1–4]. Zhou [5] studied the almost sure central limit theorem for ρ--mixing sequences under the conditions provided by Shao: the conditions of the central limit theorem together with $Var\left(\sum _{i=1}^{n}\frac{1}{i}f\left(\frac{{S}_{i}}{{\sigma }_{i}}\right)\right)=O\left({\text{log}}^{2-{\epsilon }_{0}}n\right)$ for some ${\epsilon }_{0}>0$, where f is a Lipschitz function. In this article, we study the almost sure central limit theorem of products of partial sums for ρ--mixing sequences by means of a central limit theorem for weighted sums and a moment inequality.

Here and in the sequel, let ${b}_{k,n}=\sum _{i=k}^{n}\frac{1}{i},k\le n$ with ${b}_{k,n}=0$ for $k>n$. Suppose that {X n , n ≥ 1} is a strictly stationary ρ--mixing sequence of positive random variables with EX1 = μ > 0 and Var(X1) = σ2 < ∞. Let ${\stackrel{̃}{S}}_{n}=\sum _{k=1}^{n}{Y}_{k}$ and ${S}_{n,n}=\sum _{k=1}^{n}{b}_{k,n}{Y}_{k}$, where ${Y}_{k}=\frac{{X}_{k}-\mu }{\sigma },k\ge 1$. Let ${\sigma }_{n}^{2}=Var\left({S}_{n,n}\right)$, and let C denote a positive constant, which may take different values whenever it appears in different expressions. The following are our main results.

Theorem 1.1 Let {X n , n ≥ 1} be defined as above with 0 < E|X1|r< ∞ for a certain r > 2, denote ${S}_{n}=\sum _{i=1}^{n}{X}_{i}$, and let $\gamma =\frac{\sigma }{\mu }$ be the coefficient of variation. Assume that

(a1) ${\sigma }_{1}^{2}=E{X}_{1}^{2}+2\sum _{n=2}^{\infty }Cov\left({X}_{1},{X}_{n}\right)>0$,

(a2) $\sum _{n=2}^{\infty }\left|Cov\left({X}_{1},{X}_{n}\right)\right|<\infty$,

(a3) ${\rho }^{-}\left(n\right)=O\left({\text{log}}^{-\delta }n\right)$ for some δ > 1,

(a4) $\underset{n\in N}{\text{inf}}\frac{{\sigma }_{n}^{2}}{n}>0$.

Then

$\forall x\phantom{\rule{1em}{0ex}}\underset{n\to \infty }{\text{lim}}\frac{1}{\text{log}n}\sum _{k=1}^{n}\frac{1}{k}I\left\{\frac{{S}_{k,k}}{{\sigma }_{k}}\le x\right\}=\Phi \left(x\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.$
(1.1)

Here and in the sequel, I{·} denotes the indicator function and Φ(·) is the distribution function of the standard normal random variable $\mathcal{N}$.

Theorem 1.2 Under the conditions of Theorem 1.1, we have

$\forall x,\phantom{\rule{1em}{0ex}}\underset{n\to \infty }{\text{lim}}\frac{1}{\text{log}n}\sum _{k=1}^{n}\frac{1}{k}I\left\{{\left(\prod _{i=1}^{k}\frac{{S}_{i}}{i\mu }\right)}^{\frac{1}{\gamma {\sigma }_{k}}}\le x\right\}=F\left(x\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.$
(1.2)

Here and in the sequel, F(·) is the distribution function of the random variable ${e}^{\sqrt{2}\mathcal{N}}$.
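Theorem 1.2 can be illustrated numerically in a special case (our own sketch, not part of the article): an i.i.d. sequence is trivially ρ--mixing, and for standard exponential X i we have μ = σ = γ = 1 and, by the identity of Lemma 2.3 below, ${\sigma }_{k}^{2}=2k-{b}_{1,k}$. The script compares the logarithmic average in (1.2) with $F\left(x\right)=\Phi \left(\text{log}x/\sqrt{2}\right)$:

```python
import numpy as np
from math import erf, log

rng = np.random.default_rng(0)
n = 200_000
X = rng.exponential(1.0, size=n)               # iid positive, mu = sigma = 1, so gamma = 1
S = np.cumsum(X)
k = np.arange(1, n + 1)
logprod = np.cumsum(np.log(S / k))             # log of prod_{i<=k} S_i/(i mu)
sigma_k = np.sqrt(2 * k - np.cumsum(1.0 / k))  # Var(S_{k,k}) = 2k - b_{1,k} in the iid case
T = logprod / sigma_k                          # the statistic in (1.2) equals exp(T)

def log_avg(x):
    """Left side of (1.2): (1/log n) sum_k (1/k) I{(prod_{i<=k} S_i/(i mu))^{1/(gamma sigma_k)} <= x}."""
    return float(np.sum((T <= log(x)) / k) / log(n))

def F(x):
    """Distribution function of exp(sqrt(2) N): F(x) = Phi(log(x)/sqrt(2))."""
    return 0.5 * (1.0 + erf(log(x) / 2.0))     # Phi(z) = (1 + erf(z/sqrt(2)))/2 with z = log(x)/sqrt(2)

for x in (0.5, 1.0, 2.0):
    print(f"x = {x}: logarithmic average {log_avg(x):.3f},  F(x) = {F(x):.3f}")
```

Because the averaging weight is logarithmic, convergence is slow, so the two columns agree only roughly for moderate n.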

## 2 Some lemmas

To prove our main results, we need the following lemmas.

Lemma 2.1 [3] Let {X n , n ≥ 1} be a weakly stationary ρ--mixing sequence with $E{X}_{n}=0$, $0<E{X}_{n}^{2}<\infty$, and

1. (i)

${\sigma }_{1}^{2}=E{X}_{1}^{2}+2\sum _{n=2}^{\infty }Cov\left({X}_{1},{X}_{n}\right)>0$,

2. (ii)

$\sum _{n=2}^{\infty }\left|Cov\left({X}_{1},{X}_{n}\right)\right|<\infty$,

then

$\frac{E{S}_{n}^{2}}{n}\to {{\sigma }_{1}}^{2},\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}\frac{{S}_{n}}{{\sigma }_{1}\sqrt{n}}\stackrel{d}{\to }\mathcal{N}\left(0,1\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}n\to \infty .$

Lemma 2.2 [4] For a positive real number q ≥ 2, if {X n , n ≥ 1} is a sequence of ρ--mixing random variables with EX i = 0, E|X i |q< ∞ for every i ≥ 1, then for all n ≥ 1, there is a positive constant C = C(q, ρ-(·)) such that

$E\left(\underset{1\le j\le n}{\text{max}}{\left|{S}_{j}\right|}^{q}\right)\le C\left\{\sum _{i=1}^{n}E{\left|{X}_{i}\right|}^{q}+{\left(\sum _{i=1}^{n}E{X}_{i}^{2}\right)}^{\frac{q}{2}}\right\}.$

Lemma 2.3 [6] $\sum _{i=1}^{n}{b}_{i,n}^{2}=2n-{b}_{1,n}$.
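The identity in Lemma 2.3 is purely deterministic, so it is easy to verify numerically. A minimal check (our own illustration, not from [6]):

```python
import numpy as np

def check_identity(n):
    """Verify sum_{i=1}^n b_{i,n}^2 = 2n - b_{1,n}, where b_{i,n} = sum_{j=i}^n 1/j."""
    inv = 1.0 / np.arange(1, n + 1)
    b = np.cumsum(inv[::-1])[::-1]   # b[i-1] = 1/i + 1/(i+1) + ... + 1/n
    return float(np.sum(b ** 2)), float(2 * n - b[0])

for n in (1, 2, 10, 1000):
    lhs, rhs = check_identity(n)
    print(n, lhs, rhs)               # the two values agree up to floating-point rounding
```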

Lemma 2.4 [[3], Theorem 3.2] Let {X ni , 1 ≤ in, n ≥ 1} be an array of centered random variables with $E{X}_{ni}^{2}<\infty$ for each i = 1,2,...,n. Assume that they are ρ--mixing. Let {a ni , 1 ≤ in, n ≥ 1} be an array of real numbers with a ni = ±1 for i = 1, 2,..., n. Denote ${\sigma }_{n}^{2}=Var\left(\sum _{i=1}^{n}{a}_{ni}{X}_{ni}\right)$ and suppose that

$\underset{n}{\text{sup}}\frac{1}{{\sigma }_{n}^{2}}\sum _{i=1}^{n}E{X}_{ni}^{2}<\infty ,$

and

$\underset{n\to \infty }{\mathrm{lim}\,\mathrm{sup}}\frac{1}{{\sigma }_{n}^{2}}\sum _{i,j:\left|i-j\right|\ge A,\phantom{\rule{0.3em}{0ex}}1\le i,j\le n}{\left(Cov\left({X}_{ni},{X}_{nj}\right)\right)}^{-}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}A\to \infty ,$

and the following Lindeberg condition is satisfied:

$\frac{1}{{\sigma }_{n}^{2}}\sum _{i=1}^{n}E{X}_{ni}^{2}I\left\{\left|{X}_{ni}\right|\ge \epsilon {\sigma }_{n}\right\}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty$

for every ε > 0. Then

$\frac{1}{{\sigma }_{n}}\sum _{i=1}^{n}{a}_{ni}{X}_{ni}\stackrel{d}{\to }\mathcal{N}\left(0,1\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty .$

Lemma 2.5 Let {X n , n ≥ 1} be a strictly stationary sequence of ρ--mixing random variables with EX n = 0 and $\sum _{n=2}^{\infty }\left|Cov\left({X}_{1},{X}_{n}\right)\right|<\infty$, and let $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ be an array of real numbers such that $\underset{n}{\text{sup}}\sum _{i=1}^{n}{a}_{ni}^{2}<\infty$ and $\underset{1\le i\le n}{\text{max}}\left|{a}_{ni}\right|\to 0$ as n → ∞. If $Var\left(\sum _{i=1}^{n}{a}_{ni}{X}_{i}\right)=1$ and $\left\{{X}_{n}^{2}\right\}$ is a uniformly integrable family, then

$\sum _{i=1}^{n}{a}_{ni}{X}_{i}\stackrel{d}{\to }\mathcal{N}\left(0,1\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty .$

Proof Notice that

$\sum _{i=1}^{n}{a}_{ni}{X}_{i}=\sum _{i=1}^{n}\frac{{a}_{ni}}{\left|{a}_{ni}\right|}\left|{a}_{ni}\right|{X}_{i}=:\sum _{i=1}^{n}{b}_{ni}{Y}_{ni},$

where ${b}_{ni}=\frac{{a}_{ni}}{\left|{a}_{ni}\right|}$ and Y ni = |a ni |X i . Then {Y ni , 1 ≤ i ≤ n, n ≥ 1} is an array of ρ--mixing centered random variables with $E{Y}_{ni}^{2}={a}_{ni}^{2}E{X}_{i}^{2}<\infty$, b ni = ±1 for i = 1, 2,..., n, and ${\sigma }_{n}^{2}=Var\left(\sum _{i=1}^{n}{b}_{ni}{Y}_{ni}\right)=1$. Since $\left\{{X}_{n}^{2}\right\}$ is a uniformly integrable family, we have

$\underset{n}{\text{sup}}\frac{1}{{\sigma }_{n}^{2}}\sum _{i=1}^{n}E{Y}_{ni}^{2}=\underset{n}{\text{sup}}\sum _{i=1}^{n}{a}_{ni}^{2}E{X}_{i}^{2}\le \underset{n}{\text{sup}}\sum _{i=1}^{n}{a}_{ni}^{2}\cdot \underset{i}{\text{sup}}E{X}_{i}^{2}<\infty ,$

and, since ${\left(Cov\left({Y}_{ni},{Y}_{nj}\right)\right)}^{-}\le \left|{a}_{ni}\right|\left|{a}_{nj}\right|\left|Cov\left({X}_{i},{X}_{j}\right)\right|$, stationarity and $\sum _{n=2}^{\infty }\left|Cov\left({X}_{1},{X}_{n}\right)\right|<\infty$ give

$\underset{n\to \infty }{\mathrm{lim}\,\mathrm{sup}}\frac{1}{{\sigma }_{n}^{2}}\sum _{i,j:\left|i-j\right|\ge A,\phantom{\rule{0.3em}{0ex}}1\le i,j\le n}{\left(Cov\left({Y}_{ni},{Y}_{nj}\right)\right)}^{-}\le \underset{n}{\text{sup}}\sum _{i=1}^{n}{a}_{ni}^{2}\cdot \sum _{m=A}^{\infty }\left|Cov\left({X}_{1},{X}_{m+1}\right)\right|\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}A\to \infty .$

Moreover, for every ε > 0, we get

$\begin{array}{l}\phantom{\rule{1em}{0ex}}\frac{1}{{\sigma }_{n}^{2}}\sum _{i=1}^{n}E{Y}_{ni}^{2}I\left\{\left|{Y}_{ni}\right|\ge \epsilon {\sigma }_{n}\right\}\phantom{\rule{2em}{0ex}}\\ =\sum _{i=1}^{n}{a}_{ni}^{2}E{X}_{i}^{2}I\left\{\left|{a}_{ni}\right|\cdot \left|{X}_{i}\right|\ge \epsilon \right\}\phantom{\rule{2em}{0ex}}\\ \le \underset{n}{\text{sup}}\sum _{i=1}^{n}{a}_{ni}^{2}\cdot E{X}_{1}^{2}I\left\{\left|{a}_{ni}\right|\cdot \left|{X}_{1}\right|\ge \epsilon \right\}\phantom{\rule{2em}{0ex}}\\ \le \underset{n}{\text{sup}}\sum _{i=1}^{n}{a}_{ni}^{2}\cdot E{X}_{1}^{2}I\left\{\underset{1\le i\le n}{\text{max}}\left|{a}_{ni}\right|\cdot \left|{X}_{1}\right|\ge \epsilon \right\}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}n\to \infty ,\phantom{\rule{2em}{0ex}}\end{array}$

thus the conclusion is proved by Lemma 2.4.

Lemma 2.6 [2] Suppose that f1(x) and f2(y) are real, bounded, absolutely continuous functions on R with $\left|{{f}^{\prime }}_{1}\left(x\right)\right|\le {C}_{1}$ and $\left|{{f}^{\prime }}_{2}\left(y\right)\right|\le {C}_{2}$. Then for any random variables X and Y,

$\left|Cov\left({f}_{1}\left(X\right),{f}_{2}\left(Y\right)\right)\right|\le {C}_{1}{C}_{2}\left\{-Cov\left(X,Y\right)+8{\rho }^{-}\left(X,Y\right){∥X∥}_{2,1}{∥Y∥}_{2,1}\right\},$

where ${∥X∥}_{2,1}={\int }_{0}^{\infty }{P}^{\frac{1}{2}}\left(\left|X\right|>x\right)dx$.

Lemma 2.7 Let {X n , n ≥ 1} be a strictly stationary sequence of ρ--mixing random variables with $E{X}_{1}=0$, $0<E{X}_{1}^{2}<\infty$, and

$\begin{array}{ll}\hfill 0<{\sigma }_{1}^{2}& =E{X}_{1}^{2}+2\sum _{n=2}^{\infty }Cov\left({X}_{1},{X}_{n}\right)<\infty ,\phantom{\rule{2em}{0ex}}\\ \phantom{\rule{1em}{0ex}}\phantom{\rule{1em}{0ex}}\sum _{n=2}^{\infty }\left|Cov\left({X}_{1},{X}_{n}\right)\right|<\infty ,\phantom{\rule{2em}{0ex}}\end{array}$

then for 0 < p < 2, we have

${n}^{-\frac{1}{p}}{S}_{n}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty .$

Proof By Lemma 2.1, we have

$\underset{n\to \infty }{\text{lim}}\frac{E{S}_{n}^{2}}{n}={\sigma }_{1}^{2}.$
(2.1)

Let ${n}_{k}={k}^{\alpha }$, where $\alpha >\text{max}\left\{1,\frac{p}{2-p}\right\}$. By (2.1), we get

$\sum _{k=1}^{\infty }P\left\{\left|{S}_{{n}_{k}}\right|\ge \epsilon {n}_{k}^{\frac{1}{p}}\right\}\le \sum _{k=1}^{\infty }\frac{E{S}_{{n}_{k}}^{2}}{{\epsilon }^{2}{n}_{k}^{\frac{2}{p}}}\le \sum _{k=1}^{\infty }\frac{C}{{\epsilon }^{2}{k}^{\alpha \left(\frac{2}{p}-1\right)}}<\infty .$

From Borel-Cantelli lemma, it follows that

${n}_{k}^{-\frac{1}{p}}{S}_{{n}_{k}}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}k\to \infty .$
(2.2)

And by Lemma 2.2, it follows that

$\begin{array}{l}\phantom{\rule{1em}{0ex}}\sum _{k=1}^{\infty }P\left\{\underset{{n}_{k}\le n<{n}_{k+1}}{\text{max}}\frac{\left|{S}_{n}-{S}_{{n}_{k}}\right|}{{n}^{\frac{1}{p}}}\ge \epsilon \right\}\le \sum _{k=1}^{\infty }\frac{E\underset{{n}_{k}\le n<{n}_{k+1}}{\text{max}}{\left|{S}_{n}-{S}_{{n}_{k}}\right|}^{2}}{{\epsilon }^{2}{n}_{k}^{\frac{2}{p}}}\phantom{\rule{2em}{0ex}}\\ =\sum _{k=1}^{\infty }\frac{E\underset{{n}_{k}\le n<{n}_{k+1}}{\text{max}}{\left|\sum _{i={n}_{k}+1}^{n}{X}_{i}\right|}^{2}}{{\epsilon }^{2}{n}_{k}^{\frac{2}{p}}}\le C\sum _{k=1}^{\infty }\frac{\left({n}_{k+1}-{n}_{k}\right)}{{\epsilon }^{2}{n}_{k}^{\frac{2}{p}}}\le C\sum _{k=1}^{\infty }\frac{1}{{k}^{\alpha \left(\frac{2}{p}-1\right)}}<\infty .\phantom{\rule{2em}{0ex}}\end{array}$

By Borel-Cantelli lemma, we conclude that

$\underset{{n}_{k}\le n<{n}_{k+1}}{\text{max}}\frac{\left|{S}_{n}-{S}_{{n}_{k}}\right|}{{n}^{\frac{1}{p}}}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty .$
(2.3)

For every n, there exist ${n}_{k}$ and ${n}_{k+1}$ such that ${n}_{k}\le n<{n}_{k+1}$; by (2.2) and (2.3), we have

$\begin{array}{ll}\hfill \frac{\left|{S}_{n}\right|}{{n}^{\frac{1}{p}}}& =\frac{\left|{S}_{n}-{S}_{{n}_{k}}+{S}_{{n}_{k}}\right|}{{n}^{\frac{1}{p}}}\phantom{\rule{2em}{0ex}}\\ \le \frac{\left|{S}_{{n}_{k}}\right|}{{n}_{k}^{\frac{1}{p}}}+\underset{{n}_{k}\le n<{n}_{k+1}}{\text{max}}\frac{\left|{S}_{n}-{S}_{{n}_{k}}\right|}{{n}^{\frac{1}{p}}}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty .\phantom{\rule{2em}{0ex}}\end{array}$

The proof is now completed.
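As an informal numerical illustration of Lemma 2.7 (our own sketch, not part of the article): an i.i.d. centered sequence is trivially ρ--mixing and satisfies the covariance conditions, and along one simulated path the scaled sums $n^{-1/p}{S}_{n}$ shrink toward 0:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 1.5                                  # any 0 < p < 2 works in the lemma
n = 1_000_000
Y = rng.uniform(-1.0, 1.0, size=n)       # centered, bounded, iid (hence rho^- mixing)
S = np.cumsum(Y)
k = np.arange(1, n + 1)
scaled = np.abs(S) / k ** (1.0 / p)      # |S_k| / k^{1/p}, should tend to 0 a.s.

for m in (10, 1_000, 100_000, 1_000_000):
    print(m, scaled[m - 1])
```

Printed values along the trajectory decrease toward 0, in line with the lemma; the rate on any single path is of course random.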

## 3 Proof of the theorems

Proof of Theorem 1.1 By the property of ρ--mixing sequence, it is easy to see that {Y n } is a strictly stationary ρ--mixing sequence with EY1 = 0 and $E{Y}_{1}^{2}=1$. We first prove

$\frac{{S}_{n,n}}{{\sigma }_{n}}\stackrel{d}{\to }\mathcal{N}\left(0,1\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty .$
(3.1)

Let ${a}_{ni}=\frac{{b}_{i,n}}{{\sigma }_{n}},1\le i\le n,n\ge 1$. Obviously,

$Var\left(\sum _{i=1}^{n}{a}_{ni}{Y}_{i}\right)=1.$

From condition (a4) in Theorem 1.1 and Lemma 2.3, we have

$\underset{n}{\text{sup}}\sum _{i=1}^{n}{a}_{ni}^{2}=\underset{n}{\text{sup}}\sum _{i=1}^{n}\frac{{b}_{i,n}^{2}}{{\sigma }_{n}^{2}}=\underset{n}{\text{sup}}\frac{2n-{b}_{1,n}}{{\sigma }_{n}^{2}}\le C\underset{n}{\text{sup}}\frac{2n-{b}_{1,n}}{n}<\infty ,$

and

$\underset{1\le i\le n}{\text{max}}\left|{a}_{ni}\right|=\underset{1\le i\le n}{\text{max}}\frac{{b}_{i,n}}{{\sigma }_{n}}\le \frac{{b}_{1,n}}{{\sigma }_{n}}\le \frac{C\text{log}n}{\sqrt{n}}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty .$

By stationarity of {Y n , n ≥ 1} and E |X1|2 < ∞, we know that $\left\{{Y}_{n}^{2}\right\}$ is uniformly integrable, and from condition (a2) in Theorem 1.1, we get $\sum _{n=2}^{\infty }\left|Cov\left({Y}_{1},{Y}_{n}\right)\right|<\infty$, so applying Lemma 2.5, we have

$\sum _{i=1}^{n}{a}_{ni}{Y}_{i}\stackrel{d}{\to }\mathcal{N}\left(0,1\right).$

Notice that

$\sum _{i=1}^{n}{a}_{ni}{Y}_{i}=\sum _{i=1}^{n}\frac{{b}_{i,n}{Y}_{i}}{{\sigma }_{n}}=\frac{{S}_{n,n}}{{\sigma }_{n}},$

so (3.1) is valid. Let f(x) be a bounded Lipschitz function with a Radon-Nikodym derivative h(x) bounded by Γ. From (3.1), we have

$Ef\left(\frac{{S}_{n,n}}{{\sigma }_{n}}\right)\to Ef\left(\mathcal{N}\left(0,1\right)\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty ,$

thus

$\frac{1}{\text{log}n}\sum _{k=1}^{n}\frac{1}{k}Ef\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right)-Ef\left(\mathcal{N}\left(0,1\right)\right)\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty .$
(3.2)

On the other hand, note that (1.1) is equivalent to

$\underset{n\to \infty }{\text{lim}}\frac{1}{\text{log}n}\sum _{k=1}^{n}\frac{1}{k}f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right)={\int }_{-\infty }^{\infty }f\left(x\right)d\Phi \left(x\right)=Ef\left(\mathcal{N}\left(0,1\right)\right)\phantom{\rule{1em}{0ex}}a.s.$
(3.3)

from Section 2 of Peligrad and Shao [7] and Theorem 7.1 on p. 42 of Billingsley [8]. Hence, to prove (3.3), it suffices to show that

${T}_{n}=\frac{1}{\text{log}n}\sum _{k=1}^{n}\frac{1}{k}\left[f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right)-Ef\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right)\right]\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty$
(3.4)

by (3.2). Let ${\xi }_{k}=f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right)-Ef\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right)$ for 1 ≤ k ≤ n. We have

$\begin{array}{ll}\hfill E{T}_{n}^{2}& =\frac{1}{{\text{log}}^{2}n}E{\left(\sum _{k=1}^{n}\frac{{\xi }_{k}}{k}\right)}^{2}\phantom{\rule{2em}{0ex}}\\ \le \frac{2}{{\text{log}}^{2}n}\sum _{1\le k\le l\le n,\phantom{\rule{0.3em}{0ex}}l\le 2k}\frac{\left|E{\xi }_{k}{\xi }_{l}\right|}{kl}+\frac{2}{{\text{log}}^{2}n}\sum _{1\le k\le l\le n,\phantom{\rule{0.3em}{0ex}}l>2k}\frac{\left|E{\xi }_{k}{\xi }_{l}\right|}{kl}\phantom{\rule{2em}{0ex}}\\ :={I}_{1}+{I}_{2}.\phantom{\rule{2em}{0ex}}\end{array}$
(3.5)

By the fact that f is bounded, we have

${I}_{1}\le \frac{C}{{\text{log}}^{2}n}\sum _{k=1}^{n}\sum _{l=k}^{2k}\frac{1}{kl}=\frac{C}{{\text{log}}^{2}n}\sum _{k=1}^{n}\frac{1}{k}\sum _{l=k}^{2k}\frac{1}{l}\le C\left({\text{log}}^{-1}n\right).$
(3.6)

Now we estimate I2, if l > 2k, we have

$\begin{array}{ll}\hfill {S}_{l,l}-{S}_{2k,2k}& =\left({b}_{1,l}{Y}_{1}+{b}_{2,l}{Y}_{2}+\cdots +{b}_{l,l}{Y}_{l}\right)-\left({b}_{1,2k}{Y}_{1}+{b}_{2,2k}{Y}_{2}+\cdots +{b}_{2k,2k}{Y}_{2k}\right)\phantom{\rule{2em}{0ex}}\\ =\left({b}_{2k+1,l}{Y}_{2k+1}+\cdots +{b}_{l,l}{Y}_{l}\right)+{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k},\phantom{\rule{2em}{0ex}}\end{array}$

and

$\begin{array}{ll}\hfill \left|E{\xi }_{k}{\xi }_{l}\right|& =\left|Cov\left(f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right),f\left(\frac{{S}_{l,l}}{{\sigma }_{l}}\right)\right)\right|\phantom{\rule{2em}{0ex}}\\ \le \left|Cov\left(f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right),f\left(\frac{{S}_{l,l}}{{\sigma }_{l}}\right)-f\left(\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}\right)\right)\right|\phantom{\rule{2em}{0ex}}\\ +\left|Cov\left(f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right),f\left(\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}\right)\right)\right|.\phantom{\rule{2em}{0ex}}\end{array}$

By Lemma 2.3 and condition (a2) in Theorem 1.1, we have

$\begin{array}{ll}\hfill Var\left({S}_{k,k}\right)& =\sum _{i=1}^{k}{b}_{i,k}^{2}E{Y}_{i}^{2}+2\sum _{j=1}^{k-1}\sum _{i=j+1}^{k}{b}_{i,k}{b}_{j,k}Cov\left({Y}_{i},{Y}_{j}\right)\phantom{\rule{2em}{0ex}}\\ \le \sum _{i=1}^{k}{b}_{i,k}^{2}+2\sum _{j=1}^{k}{b}_{j,k}^{2}\sum _{i=j+1}^{k}\left|Cov\left({Y}_{i},{Y}_{j}\right)\right|\phantom{\rule{2em}{0ex}}\\ \le Ck,\phantom{\rule{2em}{0ex}}\end{array}$
(3.7)

and

$\begin{array}{l}\phantom{\rule{1em}{0ex}}Var\left({S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}\right)\phantom{\rule{2em}{0ex}}\\ =\sum _{i=2k+1}^{l}{b}_{i,l}^{2}E{Y}_{i}^{2}+2\sum _{j=2k+1}^{l-1}\sum _{i=j+1}^{l}{b}_{i,l}{b}_{j,l}Cov\left({Y}_{i},{Y}_{j}\right)\phantom{\rule{2em}{0ex}}\\ \le \sum _{i=2k+1}^{l}{b}_{i,l}^{2}+2\sum _{j=2k+1}^{l-1}{b}_{j,l}^{2}\sum _{i=j+1}^{l}\left|Cov\left({Y}_{i},{Y}_{j}\right)\right|\phantom{\rule{2em}{0ex}}\\ \le Cl.\phantom{\rule{2em}{0ex}}\end{array}$

By Lemma 2.6, the definition of ρ--mixing sequence (applied to the identity function, which is increasing, and noting that the two blocks of variables are separated by a distance of at least k) and condition (a4), we have

$\begin{array}{l}\phantom{\rule{1em}{0ex}}\left|Cov\left(f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right),f\left(\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}\right)\right)\right|\phantom{\rule{2em}{0ex}}\\ \le C\left\{-Cov\left(\frac{{S}_{k,k}}{{\sigma }_{k}},\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}\right)+8{\rho }^{-}\left(k\right){∥\frac{{S}_{k,k}}{{\sigma }_{k}}∥}_{2,1}{∥\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}∥}_{2,1}\right\}\phantom{\rule{2em}{0ex}}\\ \le C{\rho }^{-}\left(k\right)\left\{1+{∥\frac{{S}_{k,k}}{{\sigma }_{k}}∥}_{2,1}{∥\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}∥}_{2,1}\right\}.\phantom{\rule{2em}{0ex}}\end{array}$

By the inequality ${∥X∥}_{2,1}\le \frac{r}{r-2}{∥X∥}_{r}\left(r>2\right)$ (cf. Zhang [[2], p. 254] or Ledoux and Talagrand [[9], p. 251]), we get

${∥\frac{{S}_{k,k}}{{\sigma }_{k}}∥}_{2,1}\le \frac{r}{r-2}{∥\frac{{S}_{k,k}}{{\sigma }_{k}}∥}_{r}=\frac{r}{r-2}\frac{1}{{\sigma }_{k}}{\left(E{\left|{S}_{k,k}\right|}^{r}\right)}^{\frac{1}{r}},$

and

$\begin{array}{ll}\hfill E{\left|{S}_{k,k}\right|}^{r}& =E{\left|\sum _{j=1}^{k}{b}_{j,k}{Y}_{j}\right|}^{r}\phantom{\rule{2em}{0ex}}\\ \le C\left\{\sum _{j=1}^{k}{b}_{j,k}^{r}E{\left|{Y}_{j}\right|}^{r}+{\left(\sum _{j=1}^{k}{b}_{j,k}^{2}E{Y}_{j}^{2}\right)}^{\frac{r}{2}}\right\}\phantom{\rule{2em}{0ex}}\\ \le C\left\{k{\text{log}}^{r}k+{k}^{\frac{r}{2}}\right\},\phantom{\rule{2em}{0ex}}\end{array}$

thus

${∥\frac{{S}_{k,k}}{{\sigma }_{k}}∥}_{2,1}\le C\left(\frac{r}{r-2}\cdot \frac{\text{log}k}{{k}^{\frac{1}{2}-\frac{1}{r}}}+\frac{r}{r-2}\right)\le C,$

similarly,

${∥\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}∥}_{2,1}\le C,$

hence

$\left|Cov\left(f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right),f\left(\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}\right)\right)\right|\le C{\rho }^{-}\left(k\right).$

Similarly to (3.7), we have

$\begin{array}{ll}\hfill Var\left({S}_{2k,2k}\right)& =\sum _{i=1}^{2k}{b}_{i,2k}^{2}E{Y}_{i}^{2}+2\sum _{j=1}^{2k-1}\sum _{i=j+1}^{2k}{b}_{i,2k}{b}_{j,2k}Cov\left({Y}_{i},{Y}_{j}\right)\phantom{\rule{2em}{0ex}}\\ \le \sum _{i=1}^{2k}{b}_{i,2k}^{2}+2\sum _{j=1}^{2k-1}{b}_{j,2k}^{2}\sum _{i=j+1}^{2k}\left|Cov\left({Y}_{i},{Y}_{j}\right)\right|\phantom{\rule{2em}{0ex}}\\ \le Ck,\phantom{\rule{2em}{0ex}}\end{array}$

and

$\begin{array}{ll}\hfill Var\left({\stackrel{̃}{S}}_{2k}\right)& =Var\left(\sum _{i=1}^{2k}{Y}_{i}\right)\phantom{\rule{2em}{0ex}}\\ =\sum _{i=1}^{2k}E{Y}_{i}^{2}+2\sum _{i=1}^{2k-1}\sum _{j=i+1}^{2k}Cov\left({Y}_{i},{Y}_{j}\right)\phantom{\rule{2em}{0ex}}\\ \le 2k+2\sum _{i=1}^{2k-1}\sum _{j=i+1}^{2k}\left|Cov\left({Y}_{i},{Y}_{j}\right)\right|\phantom{\rule{2em}{0ex}}\\ \le Ck.\phantom{\rule{2em}{0ex}}\end{array}$

Since f is a bounded Lipschitz function, we have

$\begin{array}{l}\phantom{\rule{1em}{0ex}}\left|Cov\left(f\left(\frac{{S}_{k,k}}{{\sigma }_{k}}\right),f\left(\frac{{S}_{l,l}}{{\sigma }_{l}}\right)-f\left(\frac{{S}_{l,l}-{S}_{2k,2k}-{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}}{{\sigma }_{l}}\right)\right)\right|\phantom{\rule{2em}{0ex}}\\ \le CE\frac{\left|{S}_{2k,2k}+{b}_{2k+1,l}{\stackrel{̃}{S}}_{2k}\right|}{{\sigma }_{l}}\phantom{\rule{2em}{0ex}}\\ \le C\frac{{\left(Var\left({S}_{2k,2k}\right)\right)}^{\frac{1}{2}}}{{\sigma }_{l}}+C\frac{{b}_{2k+1,l}{\left(Var\left({\stackrel{̃}{S}}_{2k}\right)\right)}^{\frac{1}{2}}}{{\sigma }_{l}}\phantom{\rule{2em}{0ex}}\\ \le C{\left(\frac{k}{l}\right)}^{\frac{1}{2}}+C{\left(\frac{k}{l}\right)}^{\frac{1}{2}}\text{log}\frac{l}{2k}\phantom{\rule{2em}{0ex}}\\ \le C{\left(\frac{k}{l}\right)}^{\epsilon },\phantom{\rule{2em}{0ex}}\end{array}$

where $0<\epsilon <\frac{1}{2}$. Hence if l > 2k, we have

$\left|E{\xi }_{k}{\xi }_{l}\right|\le C\left[{\rho }^{-}\left(k\right)+{\left(\frac{k}{l}\right)}^{\epsilon }\right].$

Thus

$\begin{array}{ll}\hfill {I}_{2}& \le \frac{C}{{\text{log}}^{2}n}\sum _{l=2}^{n}\sum _{k=1}^{l-1}\frac{1}{{k}^{1-\epsilon }{l}^{1+\epsilon }}+\frac{C}{{\text{log}}^{2}n}\sum _{l=2}^{n}\frac{1}{l}\sum _{k=1}^{l-1}\frac{{\rho }^{-}\left(k\right)}{k}\phantom{\rule{2em}{0ex}}\\ \le \frac{C}{{\text{log}}^{2}n}\sum _{l=2}^{n}\frac{1}{{l}^{1+\epsilon }}\frac{{\left(l-1\right)}^{\epsilon }}{\epsilon }+\frac{C}{{\text{log}}^{2}n}\sum _{l=2}^{n}\frac{1}{l}\left(1+\sum _{k=2}^{n}\frac{{\text{log}}^{-\delta }k}{k}\right)\phantom{\rule{2em}{0ex}}\\ \le \frac{C}{{\text{log}}^{2}n}\sum _{l=2}^{n}\frac{1}{l}+\frac{C}{{\text{log}}^{2}n}\sum _{l=2}^{n}\frac{1}{l}\phantom{\rule{2em}{0ex}}\\ \le C{\text{log}}^{-1}n.\phantom{\rule{2em}{0ex}}\end{array}$
(3.8)

Associated with (3.5), (3.6), and (3.8), we have

$E{T}_{n}^{2}\le C{\text{log}}^{-1}n.$
(3.9)

To prove (3.4), let ${n}_{k}={e}^{{k}^{\tau }}$, where τ > 1. From (3.9), we have

$\sum _{k=1}^{\infty }E{T}_{{n}_{k}}^{2}\le C\sum _{k=1}^{\infty }{\text{log}}^{-1}{n}_{k}=C\sum _{k=1}^{\infty }\frac{1}{{k}^{\tau }}<\infty .$

Thus for every ε > 0, we have

$\sum _{k=1}^{\infty }P\left\{\left|{T}_{{n}_{k}}\right|\ge \epsilon \right\}\le \sum _{k=1}^{\infty }\frac{E{T}_{{n}_{k}}^{2}}{{\epsilon }^{2}}<\infty .$

By Borel-Cantelli lemma, we have

${T}_{{n}_{k}}\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}k\to \infty .$

Note that

$\frac{\text{log}{n}_{k+1}}{\text{log}{n}_{k}}=\frac{{\left(k+1\right)}^{\tau }}{{k}^{\tau }}\to 1\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}k\to \infty .$

For every n, there exist ${n}_{k}$ and ${n}_{k+1}$ satisfying ${n}_{k}<n\le {n}_{k+1}$; we have

$\begin{array}{ll}\hfill \left|{T}_{n}\right|& \le \frac{1}{\text{log}{n}_{k}}\left|\sum _{i=1}^{{n}_{k}}\frac{{\xi }_{i}}{i}\right|+\frac{1}{\text{log}{n}_{k}}\sum _{i={n}_{k}}^{{n}_{k+1}}\frac{\left|{\xi }_{i}\right|}{i}\phantom{\rule{2em}{0ex}}\\ \le \left|{T}_{{n}_{k}}\right|+C\left(\frac{\text{log}{n}_{k+1}}{\text{log}{n}_{k}}-1\right)\to 0\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}as\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}n\to \infty ,\phantom{\rule{2em}{0ex}}\end{array}$

This proves (3.4), so the proof of Theorem 1.1 is completed.

Proof of Theorem 1.2 Let ${C}_{i}=\frac{{S}_{i}}{\mu i}$. We have

$\frac{1}{\gamma {\sigma }_{k}}\sum _{i=1}^{k}\left({C}_{i}-1\right)=\frac{1}{\gamma {\sigma }_{k}}\sum _{i=1}^{k}\left(\frac{{S}_{i}}{\mu i}-1\right)=\frac{1}{{\sigma }_{k}}\sum _{i=1}^{k}{b}_{i,k}{Y}_{i}=\frac{{S}_{k,k}}{{\sigma }_{k}}.$

Hence (1.1) is equivalent to

$\forall x\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}\underset{n\to \infty }{\text{lim}}\frac{1}{\text{log}n}\sum _{k=1}^{n}\frac{1}{k}I\left\{\frac{1}{\gamma {\sigma }_{k}}\sum _{i=1}^{k}\left({C}_{i}-1\right)\le x\right\}=\Phi \left(x\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.$
(3.10)
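The chain of equalities leading to (3.10) is purely algebraic, so it can be checked numerically on any sample. A minimal sketch (our own illustration; the Gaussian sample is arbitrary, since positivity is not needed for the algebra) verifies $\sum _{i=1}^{k}\left({C}_{i}-1\right)=\gamma \sum _{i=1}^{k}{b}_{i,k}{Y}_{i}$:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 2.0, 0.5
kmax = 50
X = rng.normal(mu, sigma, size=kmax)      # the identity does not depend on the law of X
S = np.cumsum(X)
i = np.arange(1, kmax + 1)
Y = (X - mu) / sigma
gamma = sigma / mu
b = np.cumsum((1.0 / i)[::-1])[::-1]      # b_{i,k} = sum_{j=i}^{k} 1/j with k = kmax

lhs = float(np.sum(S / (i * mu) - 1.0))   # sum_{i<=k} (C_i - 1)
rhs = float(gamma * np.sum(b * Y))        # gamma * S_{k,k}
print(lhs, rhs)                           # equal up to floating-point rounding
```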

On the other hand, to prove (1.2), it suffices to show that

$\forall x\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}\underset{n\to \infty }{\text{lim}}\frac{1}{\text{log}n}\sum _{k=1}^{n}\frac{1}{k}I\left\{\frac{1}{\gamma {\sigma }_{k}}\sum _{i=1}^{k}\text{log}{C}_{i}\le x\right\}=\Phi \left(x\right)\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.$
(3.11)

By Lemma 2.7, for some $\frac{4}{3}<p<2$ and all sufficiently large i, we have

$\left|{C}_{i}-1\right|=\left|\frac{{S}_{i}}{\mu i}-1\right|\le C{i}^{\frac{1}{p}-1}\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.$

Since log(1 + x) = x + O(x2) for $\left|x\right|<\frac{1}{2}$, we obtain

$\left|\sum _{k=1}^{n}\text{log}{C}_{k}-\sum _{k=1}^{n}\left({C}_{k}-1\right)\right|\le C\sum _{k=1}^{n}{\left({C}_{k}-1\right)}^{2}\le C\sum _{k=1}^{n}{k}^{\frac{2}{p}-2}\le C{n}^{\frac{2}{p}-1}\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.,$

and

$\sum _{k=1}^{n}\left({C}_{k}-1\right)-C{n}^{\frac{2}{p}-1}\le \sum _{k=1}^{n}\text{log}{C}_{k}\le \sum _{k=1}^{n}\left({C}_{k}-1\right)+C{n}^{\frac{2}{p}-1}\phantom{\rule{2.77695pt}{0ex}}\phantom{\rule{2.77695pt}{0ex}}a.s.$

Hence, noting that ${n}^{\frac{2}{p}-1}/{\sigma }_{n}\to 0$ by condition (a4) and $\frac{2}{p}-1<\frac{1}{2}$, for arbitrarily small ε > 0 there is n0 = n0(ω, ε) such that for every n > n0 and arbitrary x,

$I\left\{\frac{1}{\gamma {\sigma }_{k}}\sum _{i=1}^{k}\left({C}_{i}-1\right)\le x-\epsilon \right\}\le I\left\{\frac{1}{\gamma {\sigma }_{k}}\sum _{i=1}^{k}\text{log}{C}_{i}\le x\right\}\le I\left\{\frac{1}{\gamma {\sigma }_{k}}\sum _{i=1}^{k}\left({C}_{i}-1\right)\le x+\epsilon \right\},$

so by (3.10), (3.11) holds; since (3.11) is equivalent to (1.2), the proof of Theorem 1.2 is complete.

## References

1. Zhang LX, Wang XY: Convergence rates in the strong laws of asymptotically negatively associated random fields. Appl Math J Chinese Univ Ser B 1999, 14(4):406–416. 10.1007/s11766-999-0070-6

2. Zhang LX: A functional central limit theorem for asymptotically negatively dependent random fields. Acta Math Hungar 2000, 86(3):237–259. 10.1023/A:1006720512467

3. Zhang LX: Central limit theorems for asymptotically negatively associated random fields. Acta Math Sinica 2000, 6(4):691–710.

4. Wang JF, Lu FB: Inequalities of maximum of partial sums and weak convergence for a class of weak dependent random variables. Acta Math Sinica 2006, 22(3):693–700. 10.1007/s10114-005-0601-x

5. Zhou H: Note on the almost sure central limit theorem for ρ--mixing sequences. J Zhejiang Univ Sci Ed 2005, 32(5):503–505.

6. Khurelbaatar G, Rempala G: A note on the almost sure central limit theorem for the product of partial sums. Appl Math Lett 2006, 19(2):191–196. 10.1016/j.aml.2005.06.002

7. Peligrad M, Shao QM: A note on the almost sure central limit theorem. Statist Probab Lett 1995, 22: 131–136. 10.1016/0167-7152(94)00059-H

8. Billingsley P: Convergence of Probability Measures. Wiley, New York; 1968.

9. Ledoux M, Talagrand M: Probability in Banach Spaces. Springer-Verlag, New York; 1991.

## Acknowledgements

The authors were very grateful to the editor and anonymous referees for their careful reading of the manuscript and valuable suggestions which helped in significantly improving an earlier version of this article. The study was supported by the National Natural Science Foundation of China (10926169, 11171003, 11101180), Key Project of Chinese Ministry of Education (211039), Foundation of Jilin Educational Committee of China (2012-158), and Basic Research Foundation of Jilin University (201001002, 201103204).

## Author information


### Corresponding author

Correspondence to Xili Tan.

### Competing interests

The authors declare that they have no competing interests.

### Authors' contributions

XT and YZ carried out the design of the study and performed the analysis. YZ participated in its design and coordination. All authors read and approved the final manuscript.


Tan, X., Zhang, Y. & Zhang, Y. An almost sure central limit theorem of products of partial sums for ρ--mixing sequences. J Inequal Appl 2012, 51 (2012). https://doi.org/10.1186/1029-242X-2012-51