Rates of convergence of lognormal extremes under power normalization
Journal of Inequalities and Applications volume 2016, Article number: 60 (2016)
Abstract
Let \(\{X_{n},n\geq1\}\) be a sequence of independent and identically distributed random variables whose common distribution F is the lognormal distribution. In this paper, we obtain the exact uniform convergence rate of the distribution of maxima to its extreme value limit under power normalization.
1 Introduction
Let \(\{X_{n},n\geq1\}\) be a sequence of independent and identically distributed random variables with common distribution function (df) \(F(x)\). Suppose that there exist constants \(a_{n}>0 \), \(b_{n}\in\mathbb{R}\) and a non-degenerate distribution \(G(x)\) such that
$$\lim_{n\rightarrow\infty}P(M_{n}\leq a_{n}x+b_{n})=\lim_{n\rightarrow\infty}F^{n}(a_{n}x+b_{n})=G(x) \quad (1.1)$$
for all \(x\in C(G)\), the set of all continuity points of G, where \(M_{n}=\max_{1\leq i\leq n}X_{i}\) denotes the maximum of the first n random variables. Then \(G(x)\) must belong to one of the following three classes:
$$\Lambda(x)=\exp \bigl\{-e^{-x} \bigr\},\quad x\in\mathbb{R};$$
$$\Phi_{\alpha}(x)=\exp \bigl\{-x^{-\alpha} \bigr\}\mbox{ for }x>0,\mbox{ and }0\mbox{ otherwise};$$
$$\Psi_{\alpha}(x)=\exp \bigl\{-(-x)^{\alpha} \bigr\}\mbox{ for }x\leq0,\mbox{ and }1\mbox{ otherwise},$$
where α is a positive parameter. We say that F is in the max domain of attraction of G if (1.1) holds, denoted by \(F\in D_{l}(G)\). Criteria for \(F\in D_{l}(G)\) and the choice of normalizing constants \(a_{n}\) and \(b_{n}\) can be found in Galambos [1], Leadbetter et al. [2], Resnick [3], and De Haan and Ferreira [4].
The limit distributions of maxima under power normalization were first derived by Pancheva [5]. A df F is said to belong to the max domain of attraction of a non-degenerate df H under power normalization, written \(F\in D_{p}(H)\), if there exist constants \(\alpha_{n}>0\) and \(\beta_{n}>0\) such that
$$\lim_{n\rightarrow\infty}F^{n} \bigl(\alpha_{n}\vert x\vert ^{\beta_{n}}\operatorname{sign}(x) \bigr)=H(x) \quad (1.2)$$
where \(\operatorname{sign}(x)=-1,0\mbox{ or }1\) according as \(x<0\), \(x=0\) or \(x>0\). Pancheva [5] showed that H can only be a power-type df, that is,
where α is a positive parameter. Necessary and sufficient conditions for F to satisfy (1.2) have been given by Christoph and Falk [6], Mohan and Ravi [7], Mohan and Subramanya [8] and Subramanya [9].
The logarithmic normal distribution (lognormal distribution for short) is one of the most widely applied distributions in statistics, biology, and other disciplines. In this paper, we study the uniform rate of convergence in (1.2) when \(X_{n}\) follows the lognormal distribution. The probability density function of the lognormal distribution is given by

$$F'(x)=\frac{1}{\sqrt{2\pi}\,x}\exp \biggl(-\frac{(\log x)^{2}}{2} \biggr),\quad x>0.$$
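As a quick numerical sanity check of the standard lognormal model (\(\mu=0\), \(\sigma=1\), the parametrization consistent with the constants appearing in Sections 2 and 4), the following sketch verifies that the density integrates to 1 and that \(F(x)=\Phi(\log x)\); the function names are illustrative only:

```python
import math

def lognormal_pdf(x):
    # density of the standard lognormal: exp(-(log x)^2 / 2) / (sqrt(2*pi) * x), x > 0
    return math.exp(-math.log(x) ** 2 / 2) / (math.sqrt(2 * math.pi) * x)

def lognormal_cdf(x):
    # F(x) = Phi(log x); the standard normal cdf expressed via erf
    return 0.5 * (1 + math.erf(math.log(x) / math.sqrt(2)))

# midpoint-rule check that the density integrates to (almost) 1 over (0, 200);
# the tail mass beyond 200 is of order 1e-8 and thus negligible here
step = 0.001
mass = sum(lognormal_pdf((i + 0.5) * step) * step for i in range(200_000))
print(round(mass, 4), round(lognormal_cdf(200.0), 4))
```

The same `lognormal_cdf` is reused below when probing the tail behavior numerically.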
One interesting problem in extreme value analysis is to estimate the rate of uniform convergence of \(F^{n}(\cdot)\) to its extreme value distribution. Under power normalization, Chen et al. [10] derived the convergence rates of the distribution of maxima for random variables obeying the general error distribution. For convergence rates of distributions of extremes under linear normalization, see De Haan and Resnick [11] under second-order regular variation; for the normal distribution, see Hall [12] and Nair [13], whose results were extended to the general error distribution and the logarithmic general error distribution in the recent work of Peng et al. [14] and Liao and Peng [15]. For other related work on the convergence rates for specific distributions, see Castro [16] for the gamma distribution, Lin et al. [17] for the short-tailed symmetric distribution of Tiku and Vaughan [18], and Liao et al. [19] for the skew normal distribution, which extended the results of Nair [13]. The aim of this paper is to study the uniform and pointwise convergence rates of the distribution of power normalized maxima to its limit.
The rest of this article is organized as follows: some auxiliary results are given in Section 2; Section 3 presents our main results, with the related proofs deferred to Section 4.
2 Preliminaries
To prove our results, we first cite some results from Liao and Peng [15] and Mohan and Ravi [7].
In the sequel, let \(\{X_{n},n\geq1\}\) be a sequence of independent identically distributed random variables with common df F which follows the lognormal distribution. As before, let \(M_{n}=\max_{1\leq i\leq n}X_{i}\) represent the partial maximum of \(\{X_{n},n\geq1\}\). Liao and Peng [15] defined
and they obtained
From (2.2) we immediately derive \(F\in D_{l}(\Lambda)\). The following Mills ratio of the lognormal distribution is due to Liao and Peng [15]:
$$1-F(x)\sim\frac{xF'(x)}{\log x} \quad (2.3)$$
as \(x\rightarrow\infty\), where \(F'(x)\) is the density function of the lognormal distribution \(F(x)\). According to Liao and Peng [15], we have
for sufficiently large x, where \(c(x)\rightarrow(2\pi e)^{-1/2}\) as \(x\rightarrow\infty\), \(g(x)=1+(\log x)^{-2}\) and
Note that \(f'(x)\rightarrow0\) and \(g(x)\rightarrow1\) as \(x\rightarrow\infty\).
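The Mills-ratio asymptotic \(1-F(x)\sim xF'(x)/\log x\) from Liao and Peng [15] can be probed numerically. A small sketch, assuming the standard lognormal (function names are illustrative):

```python
import math

def lognormal_pdf(x):
    # F'(x) for the standard lognormal
    return math.exp(-math.log(x) ** 2 / 2) / (math.sqrt(2 * math.pi) * x)

def lognormal_sf(x):
    # 1 - F(x) = 1 - Phi(log x), computed via erfc to retain precision in the tail
    return 0.5 * math.erfc(math.log(x) / math.sqrt(2))

def mills_scaled(x):
    # (1 - F(x)) * log(x) / (x * F'(x)): should tend to 1 as x -> infinity
    return lognormal_sf(x) * math.log(x) / (x * lognormal_pdf(x))

for x in (10.0, 100.0, 1000.0):
    print(x, round(mills_scaled(x), 3))
```

The scaled ratio increases toward 1, in line with the asymptotic relation.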
Lemma 2.1
[15]
Let F denote the lognormal distribution function. Then
for \(x>1\), where
and
In order to obtain the main results, we need the following two lemmas.
Lemma 2.2
[7]
Let F denote a df and \(r(F)=\sup\{x:F(x)<1\}\). If \(F\in D_{l}(\Lambda)\) and \(r(F)=\infty\), then \(F\in D_{p}(\Phi_{1})\), where the normalizing constants may be chosen as \(\alpha_{n}=b_{n}\) and \(\beta_{n}=a_{n}/b_{n}\).
Lemma 2.3
[7]
Let F denote a df. Then \(F\in D_{p}(\Phi_{1})\) if and only if
- (i) \(r(F)>0\), and
- (ii) \(\lim_{t\uparrow r(F)}\frac{1-F( t\exp(y\bar{f}(t)))}{1-F(t)}=e^{-y}\) for some positive valued function f̄.
If (ii) holds for some f̄, then \(\int^{r(F)}_{a}((1-F(x))/x) \, \mathrm{d}x<\infty\) for \(0< a< r(F)\) and (ii) holds with the choice \(\bar{f}(t)=\int^{r(F)}_{t}((1-F(x))/x) \, \mathrm{d}x/(1-F(t))\). The normalizing constants may be chosen as \(\alpha_{n}=F^{\leftarrow}(1-1/n)\) and \(\beta_{n}=\bar{f}(\alpha_{n})\), where \(F^{\leftarrow}(x)=\inf\{y: F(y)\geq x\}\).
Theorem 2.1
Let \(\{X_{n},n\geq1\}\) be a sequence of independent identically distributed lognormal random variables. Then \(F\in D_{p}(\Phi_{1})\) and the normalizing constants can be chosen as \(\alpha^{*}_{n}=b_{n}\), \(\beta^{*}_{n}=a_{n}/b_{n}\), where \(a_{n}\) and \(b_{n}\) are given by (2.1).
Proof
Since F follows the lognormal distribution, (2.2) implies \(F\in D_{l}(\Lambda)\), and clearly \(r(F)=\infty\). Hence, by Lemma 2.2, \(F\in D_{p}(\Phi_{1})\) with \(\alpha^{*}_{n}=b_{n}\) and \(\beta^{*}_{n}=a_{n}/b_{n}\), where \(a_{n}\) and \(b_{n}\) are defined by (2.1). □
By Lemma 2.3, (2.3), and Proposition 1.1(a) in [3], a natural way to choose the constants \(\alpha_{n}\) and \(\beta_{n}\) is to solve the following equations:
and
where f is given by (2.4). The solution of (2.9) may be expressed as
and it is easy to check that \(\beta_{n}\sim(2\log n)^{-1/2}\).
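The asymptotic \(\beta_{n}\sim(2\log n)^{-1/2}\) can be checked numerically. Since the displayed formulas (2.9) and (2.10) are not reproduced above, the sketch below assumes the Lemma 2.3 choice \(\alpha_{n}=F^{\leftarrow}(1-1/n)\) and \(\beta_{n}=\bar{f}(\alpha_{n})\) as a stand-in, not necessarily the exact constants of (2.9)-(2.10):

```python
import math

def normal_sf(t):
    # survival function of the standard normal, 1 - Phi(t)
    return 0.5 * math.erfc(t / math.sqrt(2))

def normal_pdf(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def quantile(p):
    # bisection for z with 1 - Phi(z) = p; then alpha_n = exp(z) for the lognormal
    lo, hi = 0.0, 20.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if normal_sf(mid) > p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def beta_n(n):
    # beta_n = f_bar(alpha_n) with f_bar(t) = int_t^inf ((1-F(x))/x) dx / (1-F(t)).
    # Substituting x = e^s reduces the integral to int_z^inf (1-Phi(s)) ds, and
    # integration by parts gives the closed form n*phi(z) - z when 1-Phi(z) = 1/n.
    z = quantile(1.0 / n)
    return n * normal_pdf(z) - z

for n in (10 ** 3, 10 ** 6, 10 ** 9):
    # the scaled values beta_n * (2 log n)^{1/2} should be close to 1
    print(n, round(beta_n(n) * math.sqrt(2 * math.log(n)), 3))
```

The slow drift of the scaled values toward 1 reflects the logarithmic correction terms typical of (log)normal extremes.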
3 Main results
In this section, we give two main results. Theorem 3.1 shows that the rate of uniform convergence of \(F^{n}(\alpha_{n}x^{\beta_{n}})\) to its extreme value limit is proportional to \(1/\log n\). Theorem 3.2 establishes that the pointwise rate of convergence of \(|M_{n}/\alpha_{n}|^{1/\beta_{n}}\operatorname{sign}(M_{n})\) to the extreme value df \(\exp(-x^{-1})\) is of the order \(O(x^{-1}(\log x)^{2} e^{-1/x}(\log n)^{-1})\).
Theorem 3.1
Let \(\{X_{n},n\geq1\}\) be a sequence of independent and identically distributed random variables with common df F following the lognormal distribution. Then there exist absolute constants \(0<\mathcal{C}_{1}<\mathcal{C}_{2}\) such that
for \(n>n_{0}\), where \(\alpha_{n}\) and \(\beta_{n}\) are determined by (2.9) and (2.10), respectively.
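The uniform \(1/\log n\) rate of Theorem 3.1 can be illustrated numerically. The sketch below again assumes the Lemma 2.3 choice of constants (a hypothetical stand-in for (2.9)-(2.10), whose displayed formulas are not reproduced above) and evaluates the sup distance on a grid; writing \(x=e^{y}\), for the lognormal F one has \(F^{n}(\alpha_{n}x^{\beta_{n}})=\Phi^{n}(z_{n}+\beta_{n}y)\) with \(z_{n}=\log\alpha_{n}\):

```python
import math

def normal_sf(t):
    return 0.5 * math.erfc(t / math.sqrt(2))

def normal_pdf(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def quantile(p):
    # bisection: z with 1 - Phi(z) = p
    lo, hi = 0.0, 20.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if normal_sf(mid) > p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def sup_distance(n):
    # grid approximation of sup_{x>0} |F^n(alpha_n x^beta_n) - exp(-1/x)|;
    # with x = e^y this is |Phi(z + beta*y)^n - exp(-e^{-y})|
    z = quantile(1.0 / n)
    beta = n * normal_pdf(z) - z          # beta_n = f_bar(alpha_n), closed form
    best = 0.0
    for i in range(1601):
        y = -4.0 + 0.01 * i
        cdf = 1.0 - normal_sf(z + beta * y)
        fn = math.exp(n * math.log(cdf)) if cdf > 0 else 0.0
        best = max(best, abs(fn - math.exp(-math.exp(-y))))
    return best

for n in (10 ** 3, 10 ** 6):
    d = sup_distance(n)
    print(n, round(d, 4), round(d * math.log(n), 3))
```

The products \(d\cdot\log n\) stay within a fixed band as n grows, matching the two-sided bound of Theorem 3.1.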
Theorem 3.2
Let \(\alpha_{n}\) and \(\beta_{n}\) be given by (2.9) and (2.10). Then, for fixed \(x>0\),
as \(n\rightarrow\infty\).
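The pointwise bound of Theorem 3.2 can likewise be probed at a fixed point, say \(x=2\), where the error should shrink at roughly the \(1/\log n\) rate (same hypothetical stand-in constants as in the sketches above):

```python
import math

def normal_sf(t):
    return 0.5 * math.erfc(t / math.sqrt(2))

def normal_pdf(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def quantile(p):
    # bisection: z with 1 - Phi(z) = p
    lo, hi = 0.0, 20.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if normal_sf(mid) > p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def pointwise_error(n, x):
    # |F^n(alpha_n x^beta_n) - exp(-1/x)| for the standard lognormal F,
    # using F(alpha_n x^beta_n) = Phi(z + beta * log x) with z = log(alpha_n)
    z = quantile(1.0 / n)
    beta = n * normal_pdf(z) - z
    cdf = 1.0 - normal_sf(z + beta * math.log(x))
    return abs(math.exp(n * math.log(cdf)) - math.exp(-1.0 / x))

for n in (10 ** 3, 10 ** 6):
    print(n, pointwise_error(n, 2.0))
```

Doubling \(\log n\) roughly halves the pointwise error, consistent with the \((\log n)^{-1}\) factor in the stated rate.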
4 Proofs
We first prove Theorem 3.2, since it is relatively straightforward.
Proof of Theorem 3.2
By Lemma 2.1, we have
for \(x>0\), where \(T_{1}(x)=\frac{1}{\sqrt{2\pi}}(\log (\alpha_{n}x^{\beta_{n}}))^{-1}\exp(-\frac{(\log (\alpha_{n}x^{\beta_{n}}))^{2}}{2})\), \(T_{2}(x)=1-(\log (\alpha_{n}x^{\beta_{n}}))^{-2}\) and \(T_{3}(x)=\mathcal {S}(\alpha_{n}x^{\beta_{n}})\).
First, we calculate \(T_{1}(x)\). By (2.9) and (2.10), we have
Second, we estimate \(T_{2}(x)\) and \(T_{3}(x)\) for \(x>0\). By (2.10), we derive
and by Lemma 2.1 we have
Thus, we obtain
for large n and \(x>0\). We immediately get the result of Theorem 3.2 by (4.4). □
Proof of Theorem 3.1
By Theorem 3.2, there exists an absolute constant \(\mathcal{C}_{1}\) such that
In order to obtain the upper bound for \(x>0\), we need to prove
for \(n>n_{0}\), where \(d_{i}>0\), \(i=1,2,3\) are absolute constants and
is positive for \(n>n_{0}\). By (2.9), we have
for \(n>n_{0}\).
First, consider the case of \(x\geq c_{n}\). Set
where \(\Psi_{n}(x)=1-F(\alpha_{n}x^{\beta_{n}})\) and \(A_{n}(x)\rightarrow1\), as \(x\rightarrow\infty\). We have
for \(n>n_{0}\). So,
Since
for \(0< x<1\), we obtain
for \(n>n_{0}\).
Hence, we have
for \(n>n_{0}\). Thus,
for \(n>n_{0}\). By (4.8), we have
for \(x\geq c_{n}\).
We now prove (4.5). By (2.9), (2.10), and the definition of \(A_{n}(x)\), we have
for \(x>1\). Since
for \(n>n_{0}\).
Combining (4.9) with (4.10), we have
Second, consider the case \(c_{n}\leq x<1\). By Lemma 2.1, we obtain
where \(0< q_{n}(x)<1\) and
Since \(e^{-x}>1-x\) for \(x>0\), we have
But
Hence, we obtain
for \(n>n_{0}\), where \(c_{n}\leq x<1\). Therefore,
for \(n\geq n_{0}\). Thus, there exists a positive number θ satisfying \(0<\theta<1\) such that
By (4.9) and (4.11), the proof of (4.6) is complete.
Third, consider the case \(0< x< c_{n}\). In this case
we have
The proof of Theorem 3.1 is finished. □
References
Galambos, J: The Asymptotic Theory of Extreme Order Statistics. Wiley, New York (1987)
Leadbetter, MR, Lindgren, G, Rootzén, H: Extremes and Related Properties of Random Sequences and Processes. Springer, New York (1983)
Resnick, SI: Extreme Value, Regular Variation and Point Processes. Springer, New York (1987)
De Haan, L, Ferreira, A: Extreme Value Theory: An Introduction. Springer, New York (2006)
Pancheva, E: Limit theorems for extreme order statistics under nonlinear normalization. In: Stability Problems for Stochastic Models. Lect. Notes Math., vol. 1155, pp. 284-309. Springer, Berlin (1985)
Christoph, G, Falk, M: A note on domains of attraction of p-max stable laws. Stat. Probab. Lett. 28, 279-284 (1996)
Mohan, NR, Ravi, S: Max domains of attraction of univariate and multivariate p-max stable laws. Theory Probab. Appl. 37, 632-643 (1993)
Mohan, NR, Subramanya, UR: Characterization of max domains of attraction of univariate p-max stable laws. In: Proceedings of the Symposium on Distribution Theory, Kochi, Kerala, India, pp. 11-24 (1991)
Subramanya, UR: On max domains of attraction of univariate p-max stable laws. Stat. Probab. Lett. 19, 271-279 (1994)
Chen, S, Wang, C, Zhang, G: Rates of convergence of extremes for general error distribution under power normalization. Stat. Probab. Lett. 82, 385-395 (2012)
De Haan, L, Resnick, SI: Second-order regular variation and rates of convergence in extreme value theory. Ann. Probab. 24, 97-124 (1996)
Hall, P: On the rate of convergence of normal extremes. J. Appl. Probab. 16, 433-439 (1979)
Nair, KA: Asymptotic distribution and moments of normal extremes. Ann. Probab. 9, 150-153 (1981)
Peng, Z, Nadarajah, S, Lin, F: Convergence rate of extremes for the general error distribution. J. Appl. Probab. 47, 668-679 (2010)
Liao, X, Peng, Z: Convergence rates of limit distribution of maxima of lognormal samples. J. Math. Anal. Appl. 395, 643-653 (2012)
Castro, LCE: Uniform rate of convergence in extreme-value theory: normal and gamma models. Ann. Sci. Univ. Clermont-Ferrand II, Probab. Appl. 6, 25-41 (1987)
Lin, F, Zhang, X, Peng, Z, Jiang, Y: On the rate of convergence of STSD extremes. Commun. Stat., Theory Methods 40, 1795-1806 (2011)
Tiku, ML, Vaughan, DC: A family of short-tailed symmetric distributions. Technical report, McMaster University, Canada (1999)
Liao, X, Peng, Z, Nadarajah, S, Wang, X: Rates of convergence of extremes from skew normal sample. Stat. Probab. Lett. 84, 40-47 (2014)
Acknowledgements
The authors would like to thank the Editor-in-Chief, the Associate Editor, and the referees for carefully reading the paper and for their comments, which greatly improved the paper. This work was supported by the National Natural Science Foundation of China (Grant No. 11171275, No. 71461027), the SWU Grant for Statistics PhD, the Science and Technology fund Project of GZ (Grant No. LKZS[2014]29), the Science and Technology Plan Project of Guizhou Province (Grant No. LH[2015]7001, No. LH[2015]7055), 2013, 2014, and 2015, and the Zunyi 15851 talents elite project funding and Zhunyi innovative talent team (Zunyi KH(2015)38).
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
JH obtained the theorem and completed the proof. SC and YL corrected and improved the final version. All authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Huang, J., Chen, S. & Liu, Y. Rates of convergence of lognormal extremes under power normalization. J Inequal Appl 2016, 60 (2016). https://doi.org/10.1186/s13660-016-0993-4