# On the almost sure central limit theorem for self-normalized products of partial sums of ϕ-mixing random variables

## Abstract

Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of strictly stationary ϕ-mixing positive random variables which are in the domain of attraction of the normal law with $E{X}_{1}=\mu >0$, possibly infinite variance and mixing coefficients $\varphi \left(n\right)$ satisfying ${\sum }_{n\ge 1}{\varphi }^{1/2}\left({2}^{n}\right)<\mathrm{\infty }$. Under suitable conditions, we here give an almost sure central limit theorem for self-normalized products of partial sums, i.e.,

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}I\left({\left(\prod _{j=1}^{k}\frac{{S}_{j}}{j\mu }\right)}^{\mu /\left(\beta {V}_{k}\right)}\le x\right)=F\left(x\right)\phantom{\rule{1em}{0ex}}\text{a.s. for any}\phantom{\rule{0.2em}{0ex}}x\in R,$

where F is the distribution function of the random variable ${e}^{\sqrt{2}\mathcal{N}}$ and $\mathcal{N}$ is a standard normal random variable.

MSC:60F15.

## 1 Introduction and main results

The almost sure central limit theorem (ASCLT) was first introduced independently by Brosamler [1] and Schatte [2]. Since then, many interesting results have been discovered in this field. The classical ASCLT states that if $\left\{X,{X}_{n},n\ge 1\right\}$ is a sequence of i.i.d. random variables with $EX=0$, $Var\left(X\right)={\sigma }^{2}$ and ${S}_{n}={\sum }_{i=1}^{n}{X}_{i}$, then

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{logn}\sum _{k=1}^{n}\frac{1}{k}I\left\{\frac{{S}_{k}}{\sigma \sqrt{k}}\le x\right\}=\mathrm{\Phi }\left(x\right)\phantom{\rule{1em}{0ex}}\text{a.s. for any}\phantom{\rule{0.2em}{0ex}}x\in R.$
(1.1)

Here and in the sequel, $I\left\{\cdot \right\}$ denotes an indicator function and $\mathrm{\Phi }\left(\cdot \right)$ is the distribution function of the standard normal random variable. It is known (see Berkes [3]) that the class of sequences satisfying the ASCLT is larger than the class of sequences satisfying the central limit theorem. In recent years, the ASCLT for products of partial sums has received more and more attention. We refer to Gonchigdanzan and Rempala [4] on the ASCLT for the products of partial sums, and Gonchigdanzan [5] on the ASCLT for the products of partial sums with stable distribution. Li and Wang [6] and Zhang et al. [7] showed the ASCLT for products of sums and products of sums of partial sums under association. Huang and Pang [8] and Zhang and Yang [9] obtained ASCLT results for self-normalized versions. Zhang and Yang [9] proved the following ASCLT for self-normalized products of sums of i.i.d. random variables.

Theorem A Let $\left\{X,{X}_{n},n\ge 1\right\}$ be a sequence of i.i.d. positive random variables with $\mu =EX>0$, and assume that X is in the domain of attraction of the normal law. Then

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{logn}\sum _{k=1}^{n}\frac{1}{k}I\left({\left(\prod _{j=1}^{k}\frac{{S}_{j}}{j\mu }\right)}^{\mu /{V}_{k}}\le x\right)=F\left(x\right)\phantom{\rule{1em}{0ex}}\mathit{\text{a.s. for any}}\phantom{\rule{0.1em}{0ex}}x\in R,$
(1.2)

where $F\left(\cdot \right)$ is the distribution function of the random variable ${e}^{\sqrt{2}\mathcal{N}}$ and $\mathcal{N}$ is a standard normal random variable.
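As a quick numerical illustration (ours, not part of the original paper), the logarithmic average in (1.2) can be approximated by simulation. The sketch below uses i.i.d. Exp(1) variables (so $\mu =1$) and an arbitrary sample size; since the averaging is on a $logn$ scale, convergence to $F\left(x\right)=\mathrm{\Phi }\left(lnx/\sqrt{2}\right)$ is very slow and only qualitative agreement should be expected.

```python
import numpy as np

def asclt_log_average(x, n=20000, seed=0):
    """Empirical value of (1/log n) sum_{k<=n} (1/k) I{(prod_{j<=k} S_j/(j*mu))^(mu/V_k) <= x}
    for i.i.d. Exp(1) random variables, where mu = 1."""
    rng = np.random.default_rng(seed)
    X = rng.exponential(1.0, size=n)
    j = np.arange(1, n + 1)
    S = np.cumsum(X)                        # partial sums S_k
    T = np.cumsum(np.log(S / j))            # sum_{j<=k} log(S_j / (j*mu))
    V = np.sqrt(np.cumsum((X - 1.0) ** 2))  # self-normalizer V_k
    # (prod_{j<=k} S_j/(j*mu))^(mu/V_k) <= x  <=>  T_k / V_k <= log(x)
    hits = (T / V) <= np.log(x)
    return float(np.sum(hits / j) / np.log(n))

# The empirical averages are monotone in x, as the limit F must be.
vals = [asclt_log_average(x) for x in (0.5, 1.0, 2.0)]
```

The limiting values would be $F\left(0.5\right)\approx 0.31$, $F\left(1\right)=0.5$ and $F\left(2\right)\approx 0.69$; at moderate n the empirical averages only track these loosely.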

A wide literature concerning the ASCLT of self-normalized versions of independent random variables is now available, while the ASCLT for self-normalized versions of weakly dependent random variables is worth studying. Recall that if $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of random variables, ${\mathcal{F}}_{a}^{b}$ denotes the σ-field generated by the random variables ${X}_{a},{X}_{a+1},\dots ,{X}_{b}$. The sequence $\left\{{X}_{n},n\ge 1\right\}$ is called ϕ-mixing if

$\varphi \left(n\right)=\underset{k\ge 1}{sup}\phantom{\rule{0.2em}{0ex}}sup\left\{|P\left(B|A\right)-P\left(B\right)|:A\in {\mathcal{F}}_{1}^{k},P\left(A\right)>0,B\in {\mathcal{F}}_{k+n}^{\mathrm{\infty }}\right\}\to 0\phantom{\rule{1em}{0ex}}\text{as}\phantom{\rule{0.2em}{0ex}}n\to \mathrm{\infty }.$

The sequence $\left\{{X}_{n},n\ge 1\right\}$ is called ρ-mixing if

$\rho \left(n\right)=\underset{k\ge 1}{sup}\phantom{\rule{0.2em}{0ex}}sup\left\{|Corr\left(X,Y\right)|:X\in {L}_{2}\left({\mathcal{F}}_{1}^{k}\right),Y\in {L}_{2}\left({\mathcal{F}}_{k+n}^{\mathrm{\infty }}\right)\right\}\to 0\phantom{\rule{1em}{0ex}}\text{as}\phantom{\rule{0.2em}{0ex}}n\to \mathrm{\infty },$

where ${L}_{2}\left({\mathcal{F}}_{a}^{b}\right)$ is the set of all ${\mathcal{F}}_{a}^{b}$-measurable random variables with finite second moments. It is well known that $\rho \left(n\right)\le 2{\varphi }^{1/2}\left(n\right)$, and hence a ϕ-mixing sequence is ρ-mixing.
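For orientation, a simple example of a ϕ-mixing sequence (our illustration, not from the paper): any m-dependent sequence, such as the moving average ${X}_{n}={\epsilon }_{n}+{\epsilon }_{n-1}$ with i.i.d. innovations, has mixing coefficients vanishing beyond the dependence range, so the summability condition used throughout holds trivially:

```latex
% 1-dependent moving average X_n = eps_n + eps_{n-1} with i.i.d. (eps_i):
% F_1^k and F_{k+n}^infty are independent for n >= 2, hence
\varphi(n) = 0 \quad (n \ge 2),
\qquad\text{and}\qquad
\sum_{n \ge 1} \varphi^{1/2}(2^n) = \varphi^{1/2}(2) = 0 < \infty .
```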

Theorem B (Balan and Kulik [10, 11])

Let $\left\{{X}_{n},n\ge 1\right\}$ be a strictly stationary ϕ-mixing sequence of nondegenerate random variables such that $E{X}_{1}=0$ and ${X}_{1}$ belongs to the domain of attraction of the normal law. Let ${S}_{n}={\sum }_{i=1}^{n}{X}_{i}$ and ${\overline{V}}_{n}^{2}={\sum }_{i=1}^{n}{X}_{i}^{2}$. Suppose that $\varphi \left(1\right)<1$ and the mixing coefficients $\varphi \left(n\right)$ satisfy ${\sum }_{n\ge 1}{\varphi }^{1/2}\left({2}^{n}\right)<\mathrm{\infty }$. Then

$\text{(i)}\phantom{\rule{1em}{0ex}}\frac{{S}_{n}}{{\overline{A}}_{n}}\stackrel{d}{\to }\mathcal{N}\left(0,1\right)\phantom{\rule{1em}{0ex}}\mathit{\text{and}}\phantom{\rule{1em}{0ex}}\frac{{\overline{V}}_{n}}{{\overline{B}}_{n}}\stackrel{p}{\to }1,$

where

${\overline{A}}_{n}^{2}=Var\left(\sum _{i=1}^{n}{X}_{i}I\left\{|{X}_{i}|\le {\tau }_{i}\right\}\right),\phantom{\rule{2em}{0ex}}{\overline{B}}_{n}^{2}=\sum _{i=1}^{n}Var\left({X}_{i}I\left\{|{X}_{i}|\le {\tau }_{i}\right\}\right),$

and ${\tau }_{i}=inf\left\{s:s\ge 1,\frac{L\left(s\right)}{{s}^{2}}\le \frac{1}{i}\right\}$ for $i=1,2,\dots$ , where $L\left(s\right)=E{X}_{1}^{2}I\left\{|{X}_{1}|\le s\right\}$.

In this paper we study the almost sure central limit theorem, containing the general weight sequences, for weakly dependent random variables. Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of strictly stationary ϕ-mixing positive random variables which are in the domain of attraction of the normal law with $E{X}_{1}=\mu >0$, possibly infinite variance and mixing coefficients $\varphi \left(n\right)$ satisfying ${\sum }_{n\ge 1}{\varphi }^{1/2}\left({2}^{n}\right)<\mathrm{\infty }$. We here give an almost sure central limit theorem for self-normalized products of partial sums under a fairly general condition.

Throughout this paper, the following notations are frequently used. For any two positive sequences, ${a}_{n}\ll {b}_{n}$ means that there is a numerical constant C, not depending on n, such that ${a}_{n}\le C{b}_{n}$ for all n, and ${a}_{n}\sim {b}_{n}$ means ${a}_{n}/{b}_{n}\to 1$ as $n\to \mathrm{\infty }$. $\left[x\right]$ denotes the largest integer less than or equal to x, and C denotes a generic positive constant whose value can differ in different places.

We let $l\left(x\right)=E{\left({X}_{1}-\mu \right)}^{2}I\left\{|{X}_{1}-\mu |\le x\right\}$, $b=inf\left\{x\ge 1:l\left(x\right)>0\right\}$ and

${\eta }_{n}=inf\left\{s:s\ge b+1,\frac{l\left(s\right)}{{s}^{2}}\le \frac{1}{n}\right\},\phantom{\rule{1em}{0ex}}n=1,2,\dots ,$
(1.3)

then it is easy to see that $nl\left({\eta }_{n}\right)\sim {\eta }_{n}^{2}$ and ${\eta }_{n}\le {\eta }_{n+1}$ (cf. de la Peña et al. [12]). We denote

${A}_{n}^{2}=Var\left(\sum _{j=1}^{n}\left({X}_{j}-\mu \right)I\left\{|{X}_{j}-\mu |\le {\eta }_{n}\right\}\right),\phantom{\rule{2em}{0ex}}{B}_{n}^{2}=\sum _{j=1}^{n}Var\left(\left({X}_{j}-\mu \right)I\left\{|{X}_{j}-\mu |\le {\eta }_{n}\right\}\right).$
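The truncation levels ${\eta }_{n}$ of (1.3) are easy to compute numerically. The sketch below is ours, with ${X}_{1}-\mu$ taken standard normal purely for illustration, so that $l\left(s\right)$ has a closed form in terms of the error function; since $l\left(s\right)/{s}^{2}$ is decreasing on the search range here, a bisection suffices, and the output confirms $nl\left({\eta }_{n}\right)\sim {\eta }_{n}^{2}$ and ${\eta }_{n}\approx \sqrt{n}$.

```python
import math

def l(s):
    """l(s) = E (X1-mu)^2 I{|X1-mu| <= s} when X1 - mu is standard normal."""
    return math.erf(s / math.sqrt(2.0)) - s * math.sqrt(2.0 / math.pi) * math.exp(-s * s / 2.0)

def eta(n, b=1.0):
    """eta_n = inf{s >= b+1 : l(s)/s^2 <= 1/n}, located by bisection
    (l(s)/s^2 is decreasing for s >= b+1 = 2 in this example)."""
    lo, hi = b + 1.0, 1e6
    if l(lo) / lo**2 <= 1.0 / n:
        return lo
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if l(mid) / mid**2 <= 1.0 / n:
            hi = mid
        else:
            lo = mid
    return hi

ns = (10, 100, 10000)
etas = [eta(n) for n in ns]
ratios = [n * l(e) / e**2 for n, e in zip(ns, etas)]  # each close to 1: n*l(eta_n) ~ eta_n^2
```

Here $l\left(s\right)\to 1$ as $s\to \mathrm{\infty }$, so ${\eta }_{n}\approx \sqrt{n}$ for large n (e.g. ${\eta }_{10000}\approx 100$), matching the finite-variance case.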

Our main theorem is as follows.

Theorem 1.1 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of strictly stationary ϕ-mixing positive random variables with $E{X}_{1}=\mu >0$, possibly infinite variance. Assume that ${X}_{1}$ belongs to the domain of attraction of the normal law, and the mixing coefficients $\varphi \left(n\right)$ satisfy ${\sum }_{n\ge 1}{\varphi }^{1/2}\left({2}^{n}\right)<\mathrm{\infty }$. Denote ${S}_{n}={\sum }_{i=1}^{n}{X}_{i}$ and ${V}_{n}^{2}={\sum }_{i=1}^{n}{\left({X}_{i}-\mu \right)}^{2}$. If, moreover,

${A}_{n}^{2}\sim {\beta }^{2}{B}_{n}^{2}\phantom{\rule{1em}{0ex}}\mathit{\text{for some}}\phantom{\rule{0.1em}{0ex}}\beta \in \left(0,\mathrm{\infty }\right),$

then we have

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}I\left({\left(\prod _{j=1}^{k}\frac{{S}_{j}}{j\mu }\right)}^{\mu /\left(\beta {V}_{k}\right)}\le x\right)=F\left(x\right)\phantom{\rule{1em}{0ex}}\mathit{\text{a.s. for any}}\phantom{\rule{0.1em}{0ex}}x\in R,$
(1.4)

where $F\left(\cdot \right)$ is the distribution function of the random variable ${e}^{\sqrt{2}\mathcal{N}}$, $\mathcal{N}$ is a standard normal random variable and

${d}_{k}={k}^{-1}exp\left({ln}^{\alpha }k\right),\phantom{\rule{1em}{0ex}}0\le \alpha <1/2,\phantom{\rule{1em}{0ex}}{D}_{n}=\sum _{k=1}^{n}{d}_{k}.$
(1.5)
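The weight sequence in (1.5) can be explored numerically. A small sketch (ours): for $\alpha =0$ one has ${d}_{k}=e/k$ and ${D}_{n}=e{H}_{n}\sim elnn$, which can be checked against the harmonic-number expansion; for $0<\alpha <1/2$ the asymptotic ${D}_{n}\sim \frac{1}{\alpha }{\left(lnn\right)}^{1-\alpha }exp\left({ln}^{\alpha }n\right)$ sets in too slowly to verify at small n, so only the definition is exercised.

```python
import math

def weights(n, alpha):
    """d_k = exp(ln^alpha k)/k for k = 1..n and D_n = sum_k d_k, as in (1.5).
    Note: Python evaluates 0.0**0.0 as 1.0, so d_1 = e when alpha = 0."""
    d = [math.exp(math.log(k) ** alpha) / k for k in range(1, n + 1)]
    return d, sum(d)

# alpha = 0: d_k = e/k, so D_n = e * H_n = e * (ln n + gamma + O(1/n)).
d0, D0 = weights(100_000, 0.0)

# 0 < alpha < 1/2: the asymptotic form of D_n converges extremely slowly,
# so it is only evaluated, not asserted, here.
d3, D3 = weights(1_000, 0.3)
```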

Remark 1.1

If we assume that

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{l\left({\eta }_{n}\right)}\sum _{j=2}^{n}Cov\left(\left({X}_{1}-\mu \right)I\left\{|{X}_{1}-\mu |\le {\eta }_{n}\right\},\left({X}_{j}-\mu \right)I\left\{|{X}_{j}-\mu |\le {\eta }_{n}\right\}\right)=\theta >-1/2,$

then ${A}_{n}^{2}\sim {\beta }^{2}{B}_{n}^{2}$ with ${\beta }^{2}=1+2\theta$. (We write θ here to avoid confusion with the exponent α in (1.5).)

We have the following corollaries.

Corollary 1.1 Let $\left\{{X}_{n},n\ge 1\right\}$ be a strictly stationary ϕ-mixing sequence of positive random variables such that $E{X}_{1}=\mu >0$, $Var\left({X}_{1}\right)={\sigma }^{2}<\mathrm{\infty }$ and ${\sum }_{j\ge 2}|Cov\left({X}_{1},{X}_{j}\right)|<\mathrm{\infty }$; then (1.4) holds.

Corollary 1.2 Let $\left\{{X}_{n},n\ge 1\right\}$ be a strictly stationary ϕ-mixing sequence of positive random variables such that $E{X}_{1}=\mu >0$, $Var\left({X}_{1}\right)={\sigma }^{2}<\mathrm{\infty }$ and ${\sum }_{n\ge 1}{\varphi }^{1/2}\left({2}^{n}\right)<\mathrm{\infty }$. Set ${S}_{n}={\sum }_{i=1}^{n}{X}_{i}$ and ${\sigma }_{n}^{2}=Var\left({S}_{n}\right)$; then (1.4) holds.

Remark 1.2 Let ${d}_{k}=1/k$ and $\beta =1$. If $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of i.i.d. positive random variables such that $E{X}_{1}=\mu >0$ and ${X}_{1}$ belongs to the domain of attraction of the normal law, then Theorem 1.1 is just Theorem A.

Remark 1.3 By the terminology of summation procedures (see [13, p.35]), Theorem 1.1 remains valid if we replace the weight sequence ${\left\{{d}_{k}\right\}}_{k\ge 1}$ by any ${\left\{{d}_{k}^{\ast }\right\}}_{k\ge 1}$ such that $0\le {d}_{k}^{\ast }\le {d}_{k}$ and ${\sum }_{k\ge 1}{d}_{k}^{\ast }=\mathrm{\infty }$.

## 2 Lemmas

In this section, we introduce some lemmas which are used to prove our theorem.

Lemma 2.1 (Csörgő et al. [14])

Let X be a random variable with $EX=\mu$, and denote $l\left(y\right)=E{\left(X-\mu \right)}^{2}I\left\{|X-\mu |\le y\right\}$. The following statements are equivalent:

1. (a) X is in the domain of attraction of the normal law;
2. (b) ${y}^{2}P\left\{|X-\mu |>y\right\}=o\left(l\left(y\right)\right)$;
3. (c) $yE|X-\mu |I\left\{|X-\mu |>y\right\}=o\left(l\left(y\right)\right)$;
4. (d) $E{|X-\mu |}^{\alpha }I\left\{|X-\mu |\le y\right\}=o\left({y}^{\alpha -2}l\left(y\right)\right)$ for $\alpha >2$.

For all positive integers $1\le i\le k<\mathrm{\infty }$, we denote

${\stackrel{˜}{X}}_{ik}=\left({X}_{i}-\mu \right)I\left\{|{X}_{i}-\mu |\le {\eta }_{k}\right\},\phantom{\rule{2em}{0ex}}{\stackrel{ˆ}{X}}_{ik}=\left({X}_{i}-\mu \right)I\left\{|{X}_{i}-\mu |>{\eta }_{k}\right\},\phantom{\rule{2em}{0ex}}{\stackrel{˜}{X}}_{ik}^{\ast }={\stackrel{˜}{X}}_{ik}-E{\stackrel{˜}{X}}_{ik},\phantom{\rule{2em}{0ex}}{\stackrel{ˆ}{X}}_{ik}^{\ast }={\stackrel{ˆ}{X}}_{ik}-E{\stackrel{ˆ}{X}}_{ik},$

${b}_{i,k}=\sum _{l=i}^{k}\frac{1}{l},\phantom{\rule{2em}{0ex}}{\stackrel{˜}{Y}}_{k}=\sum _{i=1}^{k}{b}_{i,k}{\stackrel{˜}{X}}_{ik}^{\ast },\phantom{\rule{2em}{0ex}}{\stackrel{ˆ}{Y}}_{k}=\sum _{i=1}^{k}{b}_{i,k}{\stackrel{ˆ}{X}}_{ik}^{\ast },\phantom{\rule{2em}{0ex}}{\stackrel{˜}{V}}_{k}^{2}=\sum _{i=1}^{k}{\stackrel{˜}{X}}_{ik}^{2}.$
(2.1)

Lemma 2.2 Let f be a nonnegative, bounded Lipschitz function such that

$f\left(x\right)\le C\phantom{\rule{1em}{0ex}}\mathit{\text{and}}\phantom{\rule{1em}{0ex}}|f\left(x\right)-f\left(y\right)|\le C|x-y|\phantom{\rule{1em}{0ex}}\mathit{\text{for every}}\phantom{\rule{0.1em}{0ex}}x,y\in R.$

If the assumptions of Theorem  1.1 hold and there exists a positive constant ϵ such that

$Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ},$
(2.2)

then we have

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)=Ef\left(\mathcal{N}\left(0,1\right)\right)\phantom{\rule{1em}{0ex}}\mathit{\text{a.s.}}$
(2.3)

Proof From the formula (2.5) in Liu and Lin [15], we have

$\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\stackrel{d}{\to }\mathcal{N}\left(0,1\right)$
(2.4)

as $k\to \mathrm{\infty }$ under the hypotheses of Theorem 1.1. Then

$Ef\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\to Ef\left(\mathcal{N}\left(0,1\right)\right)$

as $k\to \mathrm{\infty }$, which implies from Toeplitz’s lemma that

$\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}Ef\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\to Ef\left(\mathcal{N}\left(0,1\right)\right)$

as $n\to \mathrm{\infty }$. To prove (2.3), we only need to show that

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}\left[f\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)-Ef\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right]=0\phantom{\rule{1em}{0ex}}\text{a.s.}$
(2.5)

Let

${\nu }_{n}=\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}\left[f\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)-Ef\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right].$

By (2.2), we have

$E{\nu }_{n}^{2}=\frac{1}{{D}_{n}^{2}}Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right)\ll {\left(ln{D}_{n}\right)}^{-1-ϵ}.$

Note that for $\alpha =0$, we get ${d}_{k}=e/k$, ${D}_{n}\sim elnn$. For $\alpha >0$, we get

$\begin{array}{rcl}{D}_{n}& \sim & {\int }_{0}^{lnn}exp\left({t}^{\alpha }\right)\phantom{\rule{0.2em}{0ex}}dt\\ \sim & {\int }_{0}^{lnn}\left(exp\left({t}^{\alpha }\right)+\frac{1-\alpha }{\alpha }{t}^{-\alpha }exp\left({t}^{\alpha }\right)\right)\phantom{\rule{0.2em}{0ex}}dt\\ =& \frac{1}{\alpha }{\left(lnn\right)}^{1-\alpha }exp\left({ln}^{\alpha }n\right),\end{array}$
(2.6)

and using Karamata’s theorem (see Seneta [16]),

$exp\left({ln}^{\alpha }x\right)=exp\left({\int }_{1}^{x}\alpha {\left(lnu\right)}^{\alpha -1}/u\phantom{\rule{0.2em}{0ex}}du\right),\phantom{\rule{1em}{0ex}}\alpha <1,$
(2.7)

is a slowly varying function at ∞. Hence ${D}_{n+1}\sim {D}_{n}$. Let γ be such that $0<\gamma <ϵ/\left(1+ϵ\right)$, and ${n}_{k}=inf\left\{n:{D}_{n}\ge exp\left({k}^{1-\gamma }\right)\right\}$, then

${D}_{{n}_{k}}\ge exp\left({k}^{1-\gamma }\right)>{D}_{{n}_{k}-1},$

and thus

$1\le \frac{{D}_{{n}_{k}}}{exp\left({k}^{1-\gamma }\right)}\sim \frac{{D}_{{n}_{k}-1}}{exp\left({k}^{1-\gamma }\right)}<1,$

which means that ${D}_{{n}_{k}}\sim exp\left({k}^{1-\gamma }\right)$. Since $\left(1-\gamma \right)\left(1+ϵ\right)>1$, we have

$\sum _{k=1}^{\mathrm{\infty }}E{\nu }_{{n}_{k}}^{2}\le C\sum _{k=1}^{\mathrm{\infty }}\frac{1}{{k}^{\left(1-\gamma \right)\left(1+ϵ\right)}}<\mathrm{\infty },$

which implies ${\nu }_{{n}_{k}}\to 0$ a.s. For any given n, there exists k such that ${n}_{k}\le n<{n}_{k+1}$. It is easy to see that by the boundedness of f,

$|{\nu }_{n}|\le |{\nu }_{{n}_{k}}|+\frac{1}{{D}_{{n}_{k}}}\sum _{i={n}_{k}}^{{n}_{k+1}}{d}_{i}\le |{\nu }_{{n}_{k}}|+C\left(\frac{{D}_{{n}_{k+1}}}{{D}_{{n}_{k}}}-1\right)\to 0\phantom{\rule{1em}{0ex}}\text{a.s.},$

which yields (2.5). Hence (2.3) holds true. □

Lemma 2.3 Assume f is a nonnegative, bounded Lipschitz function such that $f\left(x\right)\le C$ and $|f\left(x\right)-f\left(y\right)|\le C|x-y|$ for every $x,y\in R$, and that there exists a positive constant ϵ such that

$Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{ˆ}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ},$
(2.8)

$Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{˜}{V}}_{k}^{2}}{kl\left({\eta }_{k}\right)}\right)\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ},$
(2.9)

$Var\left(\sum _{i=1}^{n}{d}_{i}I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |>{\eta }_{i}\right\}\right\}\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$
(2.10)

Then, under the assumptions of Theorem 1.1, we have

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}\left[f\left(\frac{{\stackrel{ˆ}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)-Ef\left(\frac{{\stackrel{ˆ}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right]=0\phantom{\rule{1em}{0ex}}\mathit{\text{a.s.}},$
(2.11)

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}\left[f\left(\frac{{\stackrel{˜}{V}}_{k}^{2}}{kl\left({\eta }_{k}\right)}\right)-Ef\left(\frac{{\stackrel{˜}{V}}_{k}^{2}}{kl\left({\eta }_{k}\right)}\right)\right]=0\phantom{\rule{1em}{0ex}}\mathit{\text{a.s.}},$
(2.12)

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{i=1}^{n}{d}_{i}\left[I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |>{\eta }_{i}\right\}\right\}-P\left(\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |>{\eta }_{i}\right\}\right)\right]=0\phantom{\rule{1em}{0ex}}\mathit{\text{a.s.}}$
(2.13)

Proof The relations (2.11)-(2.13) follow by the same method as in the proof of Lemma 2.2, and the details are omitted here. □

To prove that under the hypotheses of Theorem 1.1, the relations (2.2) and (2.8)-(2.10) hold true, we show them by using the following four lemmas.

Lemma 2.4 Assume that f is a nonnegative, bounded Lipschitz function such that $f\left(x\right)\le C$ and $|f\left(x\right)-f\left(y\right)|\le C|x-y|$ for every $x,y\in R$. Then, under the assumptions of Theorem  1.1, there exists a positive constant ϵ such that

$Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$
(2.14)

Proof

Write

$Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right)\ll \sum _{1\le i\le j\le \left(2i\right)\wedge n}{d}_{i}{d}_{j}|Cov\left(f\left(\frac{{\stackrel{˜}{Y}}_{i}}{\beta \sqrt{2il\left({\eta }_{i}\right)}}\right),f\left(\frac{{\stackrel{˜}{Y}}_{j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\right)\right)|+\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}|Cov\left(f\left(\frac{{\stackrel{˜}{Y}}_{i}}{\beta \sqrt{2il\left({\eta }_{i}\right)}}\right),f\left(\frac{{\stackrel{˜}{Y}}_{j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\right)\right)|=:{I}_{1}+{I}_{2}.$
(2.15)

From (2.6), we get

$ln{D}_{n}\sim {ln}^{\alpha }n,\phantom{\rule{2em}{0ex}}exp\left({ln}^{\alpha }n\right)\sim \frac{{D}_{n}}{{\left(ln{D}_{n}\right)}^{\left(1-\alpha \right)/\alpha }}.$
(2.16)

Since f is a nonnegative, bounded Lipschitz function, it follows from (2.16) that for any $0<ϵ<\left(1-2\alpha \right)/\alpha$ with $0\le \alpha <1/2$,

${I}_{1}\le C\sum _{1\le i\le j\le \left(2i\right)\wedge n}{d}_{i}{d}_{j}\le C\frac{{D}_{n}^{2}}{{\left(ln{D}_{n}\right)}^{\left(1-\alpha \right)/\alpha }}\underset{i\ge 1}{sup}\sum _{j=i}^{2i}\frac{1}{j}\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$
(2.17)

Consider ${I}_{2}$ now. Let ${\stackrel{˜}{Y}}_{2i,j}={\sum }_{k=2i+1}^{j}{b}_{k,j}{\stackrel{˜}{X}}_{kj}^{\ast }={\sum }_{k=2i+1}^{j}{\sum }_{l=k}^{j}\frac{1}{l}{\stackrel{˜}{X}}_{kj}^{\ast }$ for $1\le 2i<j\le n$; then

$|Cov\left(f\left(\frac{{\stackrel{˜}{Y}}_{i}}{\beta \sqrt{2il\left({\eta }_{i}\right)}}\right),f\left(\frac{{\stackrel{˜}{Y}}_{j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\right)\right)|\le |Cov\left(f\left(\frac{{\stackrel{˜}{Y}}_{i}}{\beta \sqrt{2il\left({\eta }_{i}\right)}}\right),f\left(\frac{{\stackrel{˜}{Y}}_{2i,j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\right)\right)|+CE|\frac{{\stackrel{˜}{Y}}_{j}-{\stackrel{˜}{Y}}_{2i,j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}|=:{I}_{21}+{I}_{22}.$

The well-known property of a ϕ-mixing sequence (see [17, Lemma 1.2.9]) and the boundedness of f imply $|{I}_{21}|\le C\varphi \left(i\right)$. Since ${\sum }_{n\ge 1}{\varphi }^{1/2}\left({2}^{n}\right)<\mathrm{\infty }$ implies $\varphi \left(n\right)\ll {\left(lnn\right)}^{-1}$, it follows that for any $0<ϵ<\left(1-2\alpha \right)/\alpha$ with $0\le \alpha <1/2$,

$\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}|{I}_{21}|\le C\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}\varphi \left(i\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}$
(2.18)

Estimate ${I}_{22}$. Since ${\left\{{X}_{n}\right\}}_{n\ge 1}$ is stationary and ${\sum }_{n=1}^{\mathrm{\infty }}{\varphi }^{1/2}\left({2}^{n}\right)<\mathrm{\infty }$, it follows from the relation (2.2) in Li and Wang [6] that

$\begin{array}{rcl}E{\left(\sum _{k=1}^{2i}{b}_{k,j}{\stackrel{˜}{X}}_{kj}^{\ast }\right)}^{2}& =& \sum _{k=1}^{2i}{b}_{k,j}^{2}E{\left({\stackrel{˜}{X}}_{kj}^{\ast }\right)}^{2}+2\sum _{k=1}^{2i-1}\sum _{l=k+1}^{2i}{b}_{k,j}{b}_{l,j}E{\stackrel{˜}{X}}_{kj}^{\ast }{\stackrel{˜}{X}}_{lj}^{\ast }\\ =& \sum _{k=1}^{2i}{b}_{k,j}^{2}E{\left({\stackrel{˜}{X}}_{kj}^{\ast }\right)}^{2}+2\sum _{k=2}^{2i}\sum _{l=1}^{2i}{b}_{l,j}^{2}E{\stackrel{˜}{X}}_{1j}^{\ast }{\stackrel{˜}{X}}_{kj}^{\ast }-2\sum _{k=2}^{2i}\sum _{l=2i-k+2}^{2i}{b}_{l,j}^{2}E{\stackrel{˜}{X}}_{1j}^{\ast }{\stackrel{˜}{X}}_{kj}^{\ast }\\ -2\sum _{k=2}^{2i}\sum _{l=1}^{2i+1-k}{b}_{l,j}{b}_{l,l+k-2}E{\stackrel{˜}{X}}_{1j}^{\ast }{\stackrel{˜}{X}}_{kj}^{\ast }\\ \le & \sum _{k=1}^{2i}{b}_{k,j}^{2}\left(E{\left({\stackrel{˜}{X}}_{kj}^{\ast }\right)}^{2}+6\sum _{l=2}^{2i}|E{\stackrel{˜}{X}}_{1j}^{\ast }{\stackrel{˜}{X}}_{lj}^{\ast }|\right)\\ \le & \sum _{k=1}^{2i}{b}_{k,j}^{2}\left(l\left({\eta }_{j}\right)+Cl\left({\eta }_{j}\right)\sum _{l=1}^{\mathrm{\infty }}{\varphi }^{1/2}\left(l\right)\right)\\ \le & Cl\left({\eta }_{j}\right)\sum _{k=1}^{2i}{b}_{k,j}^{2}\end{array}$

by using Lemma 1.2.8 in Lin and Lu [17]. Note that for $n\ge k$, ${\sum }_{i=1}^{k}{log}^{2}\left(n/i\right)\le Ck\left(1+{log}^{2}\left(n/k\right)\right)$. Using the fact that ${\left\{{X}_{n}\right\}}_{n\ge 1}$ is stationary and that f is bounded and Lipschitzian, we get

$\begin{array}{rcl}{I}_{22}& \le & C\frac{E|{\sum }_{k=1}^{2i}{b}_{k,j}{\stackrel{˜}{X}}_{kj}^{\ast }|}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\\ \le & \frac{C}{\sqrt{2jl\left({\eta }_{j}\right)}}{\left(E{|\sum _{k=1}^{2i}{b}_{k,j}{\stackrel{˜}{X}}_{kj}^{\ast }|}^{2}\right)}^{1/2}\\ \le & C\frac{\sqrt{l\left({\eta }_{j}\right)}}{\sqrt{jl\left({\eta }_{j}\right)}}{\left(\sum _{k=1}^{2i}{\left(\sum _{l=k}^{j}\frac{1}{l}\right)}^{2}\right)}^{1/2}\\ \le & \frac{C}{\sqrt{j}}{\left(\sum _{k=1}^{2i}{log}^{2}\left(\frac{j}{k}\right)\right)}^{1/2}\\ \le & C\frac{\sqrt{2i}}{\sqrt{j}}{\left(1+{log}^{2}\left(j/\left(2i\right)\right)\right)}^{1/2}\\ \le & C\frac{\sqrt{2i}}{\sqrt{j}}\left(1+log\left(j/\left(2i\right)\right)\right)\\ \le & C{\left(2i/j\right)}^{\delta },\end{array}$

where $\delta \in \left(0,1/2\right)$. It follows that

$\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}{I}_{22}\le C\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}{\left(\frac{2i}{j}\right)}^{\delta }\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}$
(2.19)

for any $0<ϵ<\left(1-2\alpha \right)/\alpha$ with $0\le \alpha <1/2$. From (2.18) and (2.19), we get

${I}_{2}\le C{D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$
(2.20)

Hence, combining (2.15) with (2.17) and (2.20) yields (2.14). □

Lemma 2.5 Under the hypotheses of Lemma  2.4, there exists a positive constant ϵ such that

$Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{ˆ}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$
(2.21)

Proof By the same method as in the proof of Lemma 2.4, we show (2.21). We have

$Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{ˆ}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\right)\right)\ll \sum _{1\le i\le j\le \left(2i\right)\wedge n}{d}_{i}{d}_{j}|Cov\left(f\left(\frac{{\stackrel{ˆ}{Y}}_{i}}{\beta \sqrt{2il\left({\eta }_{i}\right)}}\right),f\left(\frac{{\stackrel{ˆ}{Y}}_{j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\right)\right)|+\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}|Cov\left(f\left(\frac{{\stackrel{ˆ}{Y}}_{i}}{\beta \sqrt{2il\left({\eta }_{i}\right)}}\right),f\left(\frac{{\stackrel{ˆ}{Y}}_{j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\right)\right)|=:{J}_{1}+{J}_{2}.$

In the same manner as in (2.17), we can see that ${J}_{1}\le C{D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}$. Consider ${J}_{2}$ now. Let ${\stackrel{ˆ}{Y}}_{2i,j}={\sum }_{k=2i+1}^{j}{b}_{k,j}{\stackrel{ˆ}{X}}_{kj}^{\ast }={\sum }_{k=2i+1}^{j}{\sum }_{l=k}^{j}\frac{1}{l}{\stackrel{ˆ}{X}}_{kj}^{\ast }$ for $1\le 2i<j\le n$; then

$|Cov\left(f\left(\frac{{\stackrel{ˆ}{Y}}_{i}}{\beta \sqrt{2il\left({\eta }_{i}\right)}}\right),f\left(\frac{{\stackrel{ˆ}{Y}}_{j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\right)\right)|\le |Cov\left(f\left(\frac{{\stackrel{ˆ}{Y}}_{i}}{\beta \sqrt{2il\left({\eta }_{i}\right)}}\right),f\left(\frac{{\stackrel{ˆ}{Y}}_{2i,j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\right)\right)|+CE|\frac{{\stackrel{ˆ}{Y}}_{j}-{\stackrel{ˆ}{Y}}_{2i,j}}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}|=:{J}_{21}+{J}_{22}.$

As in (2.18), we can see that ${\sum }_{1\le 2i<j\le n}{d}_{i}{d}_{j}|{J}_{21}|\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}$. Estimate ${J}_{22}$. By Lemma 2.1 and ${\eta }_{j}^{2}\sim jl\left({\eta }_{j}\right)$, there exists ${j}_{0}$ such that $E|{X}_{1}-\mu |I\left\{|{X}_{1}-\mu |>{\eta }_{j}\right\}\le l\left({\eta }_{j}\right)/{\eta }_{j}$ for every $j>{j}_{0}$. Using the fact that ${\left\{{X}_{n}\right\}}_{n\ge 1}$ is stationary and that f is bounded and Lipschitzian, we get

$\begin{array}{rcl}{J}_{22}& \le & C\frac{E|{\sum }_{k=1}^{2i}{b}_{k,j}{\stackrel{ˆ}{X}}_{kj}^{\ast }|}{\beta \sqrt{2jl\left({\eta }_{j}\right)}}\le C\frac{E|{\stackrel{ˆ}{X}}_{1j}^{\ast }|}{\sqrt{2jl\left({\eta }_{j}\right)}}\sum _{k=1}^{2i}{b}_{k,j}\\ \le & C\frac{E|{X}_{1}-\mu |I\left\{|{X}_{1}-\mu |>{\eta }_{j}\right\}}{\sqrt{2jl\left({\eta }_{j}\right)}}\left(\sum _{k=1}^{2i}\sum _{l=k}^{2i}\frac{1}{l}+2i{b}_{2i+1,j}\right)\\ \le & \frac{C}{\sqrt{2jl\left({\eta }_{j}\right)}}\frac{l\left({\eta }_{j}\right)}{{\eta }_{j}}\left(2i+2ilog\left(j/\left(2i\right)\right)\right)\\ \le & C{\left(\frac{2i}{j}\right)}^{\delta }\end{array}$

for large enough i with $2i, where $\delta \in \left(0,1\right)$, since for any $\gamma >0$, $logn\le {n}^{\gamma }$ for large n. Similarly, we get by (2.19)

$\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}{J}_{22}\le C\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}{\left(\frac{2i}{j}\right)}^{\delta }\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ},$

which means ${J}_{2}\le C{D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}$, and hence (2.21) is proved. □

Lemma 2.6 Under the hypotheses of Lemma  2.4, there exists a positive constant ϵ such that

$Var\left(\sum _{k=1}^{n}{d}_{k}f\left(\frac{{\stackrel{˜}{V}}_{k}^{2}}{kl\left({\eta }_{k}\right)}\right)\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$

Proof This follows by the same method as the proof of Lemma 2.4, and the details are omitted. □

Lemma 2.7 Under the hypotheses of Lemma 2.4, there exists a positive constant ϵ such that

$Var\left(\sum _{i=1}^{n}{d}_{i}I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |>{\eta }_{i}\right\}\right\}\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$
(2.22)

Proof

We divide the variance into three parts:

$Var\left(\sum _{i=1}^{n}{d}_{i}I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |>{\eta }_{i}\right\}\right\}\right)\le \sum _{i=1}^{n}{d}_{i}^{2}+2\sum _{1\le i<j\le \left(2i\right)\wedge n}{d}_{i}{d}_{j}+2\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}|Cov\left(I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |>{\eta }_{i}\right\}\right\},I\left\{\bigcup _{k=1}^{j}\left\{|{X}_{k}-\mu |>{\eta }_{j}\right\}\right\}\right)|=:{L}_{1}+{L}_{2}+{L}_{3}.$
(2.23)

It is clear from (2.7) and (2.17) that

${L}_{1}\le \sum _{i=1}^{n}\frac{exp\left(2{ln}^{\alpha }i\right)}{{i}^{2}}\le C,\phantom{\rule{2em}{0ex}}{L}_{2}\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$
(2.24)

Consider ${L}_{3}$ now. Since $I\left(E\cup F\right)-I\left(F\right)\le I\left(E\right)$ for any events E and F, we note that for $1\le 2i<j\le n$,

$|Cov\left(I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |\ge {\eta }_{i}\right\}\right\},I\left\{\bigcup _{k=1}^{j}\left\{|{X}_{k}-\mu |\ge {\eta }_{j}\right\}\right\}\right)|\le |Cov\left(I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |\ge {\eta }_{i}\right\}\right\},I\left\{\bigcup _{k=2i+1}^{j}\left\{|{X}_{k}-\mu |\ge {\eta }_{j}\right\}\right\}\right)|+2EI\left\{\bigcup _{k=1}^{2i}\left\{|{X}_{k}-\mu |\ge {\eta }_{j}\right\}\right\}.$

From the property of a ϕ-mixing sequence and $\varphi \left(i\right)\ll {\left(logi\right)}^{-1}$, we have

$|Cov\left(I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |\ge {\eta }_{i}\right\}\right\},I\left\{\bigcup _{k=2i+1}^{j}\left\{|{X}_{k}-\mu |\ge {\eta }_{j}\right\}\right\}\right)|\le C\varphi \left(i\right),$

and hence

$\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}|Cov\left(I\left\{\bigcup _{k=1}^{i}\left\{|{X}_{k}-\mu |\ge {\eta }_{i}\right\}\right\},I\left\{\bigcup _{k=2i+1}^{j}\left\{|{X}_{k}-\mu |\ge {\eta }_{j}\right\}\right\}\right)|\le C\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}\varphi \left(i\right)\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}$
(2.25)

for any $0<ϵ<\left(1-2\alpha \right)/\alpha$. By the stationarity of ${\left\{{X}_{n}\right\}}_{n\ge 1}$ and Lemma 2.1(b), we get ${\sum }_{k=1}^{n}P\left\{|{X}_{k}-\mu |\ge {\eta }_{n}\right\}=nP\left\{|{X}_{1}-\mu |\ge {\eta }_{n}\right\}=o\left(1\right)$, which yields $EI\left\{{\bigcup }_{k=1}^{2i}\left\{|{X}_{k}-\mu |\ge {\eta }_{j}\right\}\right\}\le {\sum }_{k=1}^{2i}P\left\{|{X}_{k}-\mu |\ge {\eta }_{j}\right\}=2iP\left\{|{X}_{1}-\mu |\ge {\eta }_{j}\right\}\ll 2i/j$, and hence, in the same way as in (2.19),

$\sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}EI\left\{\bigcup _{k=1}^{2i}\left\{|{X}_{k}-\mu |\ge {\eta }_{j}\right\}\right\}\ll \sum _{1\le 2i<j\le n}{d}_{i}{d}_{j}\frac{2i}{j}\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}$
(2.26)

From (2.25) and (2.26), it follows that

${L}_{3}\ll {D}_{n}^{2}{\left(ln{D}_{n}\right)}^{-1-ϵ}.$
(2.27)

Therefore, combining (2.23) with (2.24) and (2.27), we obtain (2.22), which is our claim. □

## 3 Proof of Theorem 1.1

Let ${C}_{i}={S}_{i}/\left(i\mu \right)$. To prove Theorem 1.1, it suffices to show that

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}I\left\{\frac{\mu }{\beta \sqrt{2}{V}_{k}}\sum _{i=1}^{k}log{C}_{i}\le x\right\}=\mathrm{\Phi }\left(x\right)\phantom{\rule{1em}{0ex}}\text{a.s.}$

for any $x\in R$. For any given $0<ϵ<1$, it is clear that

$I\left\{\frac{\mu }{\beta \sqrt{2}{V}_{k}}\sum _{i=1}^{k}log{C}_{i}\le x\right\}\le I\left\{\frac{\mu {\sum }_{i=1}^{k}log{C}_{i}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x\right\}+I\left\{{\stackrel{˜}{V}}_{k}^{2}>\left(1+ϵ\right)kl\left({\eta }_{k}\right)\right\}+I\left\{\bigcup _{i=1}^{k}\left\{|{X}_{i}-\mu |>{\eta }_{k}\right\}\right\}$

and

$I\left\{\frac{\mu }{\beta \sqrt{2}{V}_{k}}\sum _{i=1}^{k}log{C}_{i}\le x\right\}\ge I\left\{\frac{\mu {\sum }_{i=1}^{k}log{C}_{i}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1∓ϵ}\cdot x\right\}-I\left\{{\stackrel{˜}{V}}_{k}^{2}<\left(1-ϵ\right)kl\left({\eta }_{k}\right)\right\}-I\left\{\bigcup _{i=1}^{k}\left\{|{X}_{i}-\mu |>{\eta }_{k}\right\}\right\}.$

Hence, by the arbitrariness of ϵ, it suffices to show

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}I\left\{\frac{\mu {\sum }_{i=1}^{k}log{C}_{i}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x\right\}=\mathrm{\Phi }\left(\sqrt{1±ϵ}\cdot x\right)\phantom{\rule{1em}{0ex}}\text{a.s.},$
(3.1)

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}I\left\{\bigcup _{i=1}^{k}\left\{|{X}_{i}-\mu |>{\eta }_{k}\right\}\right\}=0\phantom{\rule{1em}{0ex}}\text{a.s.},$
(3.2)

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}I\left\{{\stackrel{˜}{V}}_{k}^{2}>\left(1+ϵ\right)kl\left({\eta }_{k}\right)\right\}=0\phantom{\rule{1em}{0ex}}\text{a.s.},$
(3.3)

$\underset{n\to \mathrm{\infty }}{lim}\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}I\left\{{\stackrel{˜}{V}}_{k}^{2}<\left(1-ϵ\right)kl\left({\eta }_{k}\right)\right\}=0\phantom{\rule{1em}{0ex}}\text{a.s.}$
(3.4)

Let $0<\delta <1/2$ and f be a real function such that for any given $x\in R$,

$I\left\{y\le \sqrt{1±ϵ}\cdot x-\delta \right\}\le {f}_{x}\left(y\right)=f\left(y\right)\le I\left\{y\le \sqrt{1±ϵ}\cdot x+\delta \right\}.$

We first prove that (3.1) holds under condition (2.2). Note that $E{|X|}^{p}<\mathrm{\infty }$ for all $1<p<2$ since X belongs to the domain of attraction of the normal law. For our purpose, we fix $4/3<p<2$. By the Marcinkiewicz-Zygmund strong law of large numbers for ϕ-mixing sequences (see [17, Remark 8.2.1] and [18]), for i large enough, we have

$|{C}_{i}-1|\le {i}^{1/p-1}\phantom{\rule{1em}{0ex}}\text{a.s.}$

It is easy to see that $log\left(1+x\right)-x=O\left({x}^{2}\right)$ as $x\to 0$. Thus

$|\sum _{i=1}^{k}log{C}_{i}-\sum _{i=1}^{k}\left({C}_{i}-1\right)|\ll \sum _{i=1}^{k}{\left({C}_{i}-1\right)}^{2}\ll {k}^{2/p-1}\phantom{\rule{1em}{0ex}}\text{a.s.}$

Hence for almost every event ω and any $0<{\delta }_{1}<1/4$, there exists ${k}_{0}={k}_{0}\left(\omega ,{\delta }_{1},x\right)$ such that for $k>{k}_{0}$,

$\begin{array}{rcl}I\left\{\frac{\mu {\sum }_{i=1}^{k}\left({C}_{i}-1\right)}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x-{\delta }_{1}\right\}& \le & I\left\{\frac{\mu {\sum }_{i=1}^{k}log{C}_{i}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x\right\}\\ \le & I\left\{\frac{\mu {\sum }_{i=1}^{k}\left({C}_{i}-1\right)}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x+{\delta }_{1}\right\}.\end{array}$
(3.5)

We note that

$\mu \sum _{i=1}^{k}\left({C}_{i}-1\right)=\sum _{j=1}^{k}\sum _{l=j}^{k}\frac{1}{l}{\stackrel{˜}{X}}_{jk}^{\ast }+\sum _{j=1}^{k}\sum _{l=j}^{k}\frac{1}{l}{\stackrel{ˆ}{X}}_{jk}^{\ast }={\stackrel{˜}{Y}}_{k}+{\stackrel{ˆ}{Y}}_{k}.$

So, for any $0<{\delta }_{2}<1/4$, we have

$\begin{array}{rcl}I\left\{\frac{\mu {\sum }_{i=1}^{k}\left({C}_{i}-1\right)}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x+{\delta }_{1}\right\}& \le & I\left\{\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x+{\delta }_{1}+{\delta }_{2}\right\}\\ +I\left\{\frac{|{\stackrel{ˆ}{Y}}_{k}|}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}>{\delta }_{2}\right\}\end{array}$
(3.6)

and

$\begin{array}{rcl}I\left\{\frac{\mu {\sum }_{i=1}^{k}\left({C}_{i}-1\right)}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x-{\delta }_{1}\right\}& \ge & I\left\{\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x-{\delta }_{1}-{\delta }_{2}\right\}\\ -I\left\{\frac{|{\stackrel{ˆ}{Y}}_{k}|}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}>{\delta }_{2}\right\}.\end{array}$
(3.7)

Let $\lambda ={\delta }_{2}\beta \sqrt{2}$ with $0<{\delta }_{2}<1/4$. By using the fact that ${\left\{{X}_{k}\right\}}_{k\ge 1}$ is stationary and Lemma 2.1(c), we have

$\begin{array}{rcl}P\left\{|{\stackrel{ˆ}{Y}}_{k}|\ge \lambda \sqrt{kl\left({\eta }_{k}\right)}\right\}& \le & P\left\{\sum _{i=1}^{k}{b}_{i,k}|{\stackrel{ˆ}{X}}_{1k}^{\ast }|\ge \lambda \sqrt{kl\left({\eta }_{k}\right)}\right\}\le \frac{\left({\sum }_{i=1}^{k}{b}_{i,k}\right)E|{\stackrel{ˆ}{X}}_{1k}^{\ast }|}{\lambda \sqrt{kl\left({\eta }_{k}\right)}}\\ \le & \frac{2kE|{X}_{1}-\mu |I\left\{|{X}_{1}-\mu |\ge {\eta }_{k}\right\}}{\lambda \sqrt{kl\left({\eta }_{k}\right)}}=o\left(1\right),\end{array}$
(3.8)

and by (2.3) in Lemma 2.2, we get

$\frac{1}{{D}_{n}}\sum _{k=1}^{n}{d}_{k}I\left\{\frac{{\stackrel{˜}{Y}}_{k}}{\beta \sqrt{2kl\left({\eta }_{k}\right)}}\le \sqrt{1±ϵ}\cdot x±{\delta }_{1}±{\delta }_{2}\right\}\to \mathrm{\Phi }\left(\sqrt{1±ϵ}\cdot x±{\delta }_{1}±{\delta }_{2}\right)\phantom{\rule{1em}{0ex}}\text{a.s.}$
(3.9)

for any $x\in R$. Hence, combining (3.5)-(3.9) yields (3.1) by the arbitrariness of ${\delta }_{1}$, ${\delta }_{2}$. For (3.2), it is clear from (2.13) in Lemma 2.3 that (3.2) holds true since $P\left({\bigcup }_{i=1}^{k}\left\{|{X}_{i}-\mu |\ge {\eta }_{k}\right\}\right)\le kP\left\{|{X}_{1}-\mu |\ge {\eta }_{k}\right\}=o\left(1\right)$. Consider (3.3). By (2.12) in Lemma 2.3, it suffices to show that

$P\left\{{\stackrel{˜}{V}}_{k}^{2}>\left(1+ϵ\right)kl\left({\eta }_{k}\right)\right\}\to 0\phantom{\rule{1em}{0ex}}\text{as}\phantom{\rule{0.2em}{0ex}}k\to \mathrm{\infty }.$

We note that ${\left\{{\stackrel{˜}{X}}_{jk}^{2}-E{\stackrel{˜}{X}}_{jk}^{2}\right\}}_{j=1}^{k}$ is a ϕ-mixing sequence with the same mixing coefficients $\varphi \left(k\right)$. Using Lemma 2.3 in Shao [19] and Lemma 2.1(d), we obtain

${\eta }_{k}^{-4}E{\left(\sum _{j=1}^{k}\left({\stackrel{˜}{X}}_{jk}^{2}-E{\stackrel{˜}{X}}_{jk}^{2}\right)\right)}^{2}\le Ck{\eta }_{k}^{-4}\underset{1\le j\le k}{max}E{\left({\stackrel{˜}{X}}_{jk}^{2}-E{\stackrel{˜}{X}}_{jk}^{2}\right)}^{2}\le Ck{\eta }_{k}^{-4}E{\stackrel{˜}{X}}_{1k}^{4}=o\left(1\right).$

Hence, by Chebyshev’s inequality and again recalling ${\eta }_{k}^{2}\sim kl\left({\eta }_{k}\right)$, we have

$P\left\{|{\stackrel{˜}{V}}_{k}^{2}-E{\stackrel{˜}{V}}_{k}^{2}|>ϵkl\left({\eta }_{k}\right)\right\}\le \frac{E{|{\stackrel{˜}{V}}_{k}^{2}-E{\stackrel{˜}{V}}_{k}^{2}|}^{2}}{{ϵ}^{2}{\left(kl\left({\eta }_{k}\right)\right)}^{2}}\le C{ϵ}^{-2}{\eta }_{k}^{-4}E{\left(\sum _{j=1}^{k}\left({\stackrel{˜}{X}}_{jk}^{2}-E{\stackrel{˜}{X}}_{jk}^{2}\right)\right)}^{2}=o\left(1\right)$

and $E{\stackrel{˜}{V}}_{k}^{2}={\sum }_{i=1}^{k}l\left({\eta }_{i}\right)\sim kl\left({\eta }_{k}\right)$, which implies that

$P\left\{{\stackrel{˜}{V}}_{k}^{2}>\left(1+ϵ\right)kl\left({\eta }_{k}\right)\right\}\le P\left\{{\stackrel{˜}{V}}_{k}^{2}-E{\stackrel{˜}{V}}_{k}^{2}>\frac{ϵ}{2}kl\left({\eta }_{k}\right)\right\}=o\left(1\right),$

and hence (3.3) holds true. Similarly,

$P\left\{{\stackrel{˜}{V}}_{k}^{2}<\left(1-ϵ\right)kl\left({\eta }_{k}\right)\right\}=o\left(1\right),$

which implies (3.4). The proof is complete.

## References

1. Brosamler GA: An almost everywhere central limit theorem. Math. Proc. Camb. Philos. Soc. 1988, 104: 561–574. 10.1017/S0305004100065750

2. Schatte P: On strong versions of the central limit theorem. Math. Nachr. 1988, 137: 249–256. 10.1002/mana.19881370117

3. Berkes I: Results and problems related to the pointwise central limit theorem. In Asymptotic Results in Probability and Statistics (A Volume in Honor of Miklós Csörgő). Edited by: Szyszkowicz B. Elsevier, Amsterdam; 1998:59–60.

4. Gonchigdanzan K, Rempala GA: A note on the almost sure limit theorem for the product of partial sums. Appl. Math. Lett. 2006, 19: 191–196. 10.1016/j.aml.2005.06.002

5. Gonchigdanzan K: An almost sure limit theorem for the product of partial sums with stable distribution. Stat. Probab. Lett. 2008, 78: 3170–3175. 10.1016/j.spl.2008.06.003

6. Li YX, Wang JF: An almost sure central limit theorem for products of sums under association. Stat. Probab. Lett. 2008, 78(4):367–375. 10.1016/j.spl.2007.07.009

7. Zhang Y, Yang XY, Dong ZS: An almost sure central limit theorem for products of sums of partial sums under association. J. Math. Anal. Appl. 2009, 355(2):708–716. 10.1016/j.jmaa.2009.01.071

8. Huang SH, Pang TX: An almost sure central limit theorem for self-normalized partial sums. Comput. Math. Appl. 2010, 60: 2639–2644. 10.1016/j.camwa.2010.08.093

9. Zhang Y, Yang XY: An almost sure central limit theorem for self-normalized products of sums of i.i.d. random variables. J. Math. Anal. Appl. 2011, 376: 29–41. 10.1016/j.jmaa.2010.10.021

10. Balan, RM, Kulik, R: Self-normalized weak invariance principle for mixing sequences. Tech. Rep. Ser. 417, Lab. Reas. Probab. Stat., Univ. Ottawa-Carleton Univ. (2005)

11. Balan RM, Kulik R: Weak invariance principle for mixing sequences in the domain of attraction of normal law. Studia Sci. Math. Hung. 2009, 46(3):329–343.

12. de la Peña VH, Lai TL, Shao QM: Self-Normalized Processes. Springer, Berlin; 2010.

13. Chandrasekharan K, Minakshisundaram S: Typical Means. Oxford University Press, Oxford; 1952.

14. Csörgő M, Szyszkowicz B, Wang Q: Donsker’s theorem for self-normalized partial sums processes. Ann. Probab. 2003, 31(3):1228–1240. 10.1214/aop/1055425777

15. Liu W, Lin ZY: Asymptotics for self-normalized random products of sums for mixing sequences. Stoch. Anal. Appl. 2007, 25: 739–762. 10.1080/07362990701419938

16. Seneta E: Regularly Varying Functions. Springer, Berlin; 1976.

17. Lin Z, Lu C: Limit Theory for Mixing Dependent Random Variables. Kluwer Academic, Boston; 1996.

18. Xue LG: Convergence rates of the strong law of large numbers for a mixing sequence. J. Syst. Sci. Math. Sci. 1994, 14: 213–221. (in Chinese)

19. Shao QM: Almost sure invariance principle for mixing sequences of random variables. Stoch. Process. Appl. 1993, 48: 319–334. 10.1016/0304-4149(93)90051-5

## Author information


### Corresponding author

Correspondence to Kyo-Shin Hwang.



Hwang, KS. On the almost sure central limit theorem for self-normalized products of partial sums of ϕ-mixing random variables. J Inequal Appl 2013, 155 (2013). https://doi.org/10.1186/1029-242X-2013-155


### Keywords

• almost sure central limit theorem
• ϕ-mixing
• domain of attraction of the normal law
• self-normalized product of partial sums
• strictly stationary 