Limit theorems for ratios of order statistics from uniform distributions
Journal of Inequalities and Applications volume 2019, Article number: 303 (2019)
Abstract
Let \(\{X_{ni}, 1 \leq i \leq m_{n}, n\geq 1\}\) be an array of independent random variables with uniform distribution on \([0, \theta _{n}]\), and let \(\{X_{n(k)}, k=1, 2, \ldots , m_{n}\}\) denote the order statistics of \(\{X_{ni}, 1 \leq i \leq m_{n}\}\). We study the limit properties of the ratios \(R_{nij}=X_{n(j)}/X_{n(i)}\), \(1\leq i < j \leq m_{n}\), for fixed sample size \(m_{n}=m\), based on their moment properties. For \(1=i < j \leq m\), we establish a weighted law of large numbers, complete convergence, and a large deviation principle, and for \(2=i < j \leq m\), we obtain some classical limit theorems and self-normalized limit theorems.
1 Introduction
Let \(\{X_{ni}, 1 \leq i \leq m_{n}, n\geq 1\}\) be an array of independent random variables with uniform distribution on \([0, \theta _{n}]\), and let \(\{X_{n(k)}, k=1, 2, \ldots , m_{n}\}\) denote the order statistics of \(\{X_{ni}, 1 \leq i \leq m_{n}\}\) for given n. In contrast to the commonly studied functionals of order statistics, such as the extreme values, the median, the sample range, and linear functions of order statistics, some researchers have investigated the limit behaviors of the ratios of these order statistics.
To derive the density function of \(R_{nij}\), we recall that the joint density function of \(X_{n(i)}\) and \(X_{n(j)}\) is
Let \(\omega =x_{i}\), \(r=x_{j}/x_{i}\). Then the Jacobian of the transformation is ω, so that the joint density function of \(X_{n(i)}\) and \(R_{nij}\) is
Therefore, the density function of the ratio \(R_{nij}\) is
which is not related to \(\theta _{n}\). Thus, for fixed i, j and the sample size \(m_{n}=m\), \(\{R_{nij}, n\geq 1\}\) is a sequence of independent identically distributed (i.i.d.) random variables, which allows us to obtain better limit behaviors for \(\{R_{nij}, n\geq 1\}\), although the random variables in different rows of the array \(\{X_{ni}, 1 \leq i \leq m_{n}, n\geq 1\}\) possess different uniform distributions.
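This scale invariance is easy to check numerically. The following Python sketch (a Monte Carlo illustration added here, not part of the original derivation) compares the empirical median of \(R_{n12}\) for two very different scales \(\theta _{n}\); for \(j=2\) the ratio has density \(1/r^{2}\) on \([1,\infty )\), so its distribution function is \(1-1/r\) and its median is 2, regardless of \(\theta _{n}\).

```python
import random

random.seed(1)

def sample_ratios(theta, m, i, j, n_draws):
    """Draw n_draws copies of R_nij = X_(j) / X_(i) from m uniforms on [0, theta]."""
    out = []
    for _ in range(n_draws):
        xs = sorted(random.uniform(0, theta) for _ in range(m))
        out.append(xs[j - 1] / xs[i - 1])
    return out

def median(values):
    s = sorted(values)
    return s[len(s) // 2]

# Empirical median of R_n12 (m = 2) for two very different scales theta_n.
# The density 1/r^2 on [1, oo) gives a theoretical median of 2 in both cases.
med_small = median(sample_ratios(1.0, 2, 1, 2, 50_000))
med_large = median(sample_ratios(1e6, 2, 1, 2, 50_000))
print(med_small, med_large)
```

Both medians land near 2: the scale \(\theta _{n}\) cancels inside the ratio, which is exactly why \(\{R_{nij}, n\geq 1\}\) is i.i.d. even though the rows have different uniform distributions.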
In this paper, some generalized results of \(\{R_{nij}, n\geq 1\}\) for the fixed sample size \(m_{n}=m\) are established based on the works by Adler [1], Miao et al. [7], and Xu and Miao [11]. In Sect. 2, we first derive the moments of \(R_{nij}\), on which the limit theorems for \(R_{n1j}\) and \(R_{n2j}\) are established in Sects. 3 and 4, respectively.
Throughout this paper, let \(\log x =\ln \max \{e,x\}\), where “ln” is the natural logarithm. Denote by C a generic positive real number, which is not necessarily the same value in each appearance, and by \(I(B)\) the indicator function of set B.
2 Moments of \(R_{nij}\)
It is known from (1.1) that the density function of \(R_{n1j}\) is
where
and the density function of \(R_{n2j}\) is
where
Theorem 2.1
- (I)
For \(1< j\leq m\), we have
$$ \textstyle\begin{cases} \mathbb{E} R_{n1j}^{\beta }< \infty , & 0 < \beta < 1, \\ \mathbb{E} R_{n1j}^{\beta }= \infty , & \beta \geq 1. \end{cases} $$
- (II)
For \(2< j\leq m\), we have
$$ \textstyle\begin{cases} \mathbb{E} R_{n2j}^{\beta } < \infty , & 0 < \beta < 2, \\ \mathbb{E} R_{n2j}^{\beta } = \infty , & \beta \geq 2. \end{cases} $$
- (III)
Let \(L(x)=\mathbb{E} R_{n2j}^{2} I(|R_{n2j}|\leq x)\) and \(\widetilde{L}(x)=\mathbb{E}(R_{n2j}-\mathbb{E} R_{n2j})^{2} I(|R_{n2j}- \mathbb{E} R_{n2j}|\leq x)\). Then both \(L(x)\) and \(\widetilde{L}(x)\) are slowly varying functions.
Proof
(I) It follows easily from (2.1) that \(f_{R_{n1j}}(r)\leq \gamma _{m,1,j}/r^{2}\) and \(f_{R_{n1j}}(r)\sim \gamma _{m,1,j}/r^{2}\) as \(r\to \infty \). Therefore, \(\mathbb{E} R_{n1j}^{\beta }< \infty \) for \(0 < \beta <1\) and \(\mathbb{E} R_{n1j}^{\beta }= \infty \) for \(\beta \geq 1\).
(II) Similarly, since \(f_{R_{n2j}}(r) \leq \gamma _{m,2,j}/r^{3}\) and \(f_{R_{n2j}}(r) \sim \gamma _{m,2,j}/r^{3}\) as \(r\to \infty \), we have \(\mathbb{E} R_{n2j}^{\beta } < \infty \) for \(0 < \beta < 2\) and \(\mathbb{E} R_{n2j}^{\beta } = \infty \) for \(\beta \geq 2\).
(III) For any \(\lambda >0\),
It is not difficult to prove that \(L(\lambda x) \sim \gamma _{m,2,j} \log x \sim L(x)\) as \(x\to \infty \), which implies that \(L(x)\) is slowly varying as \(x\to \infty \); the same holds for \(\widetilde{L}(x)\), since \(\widetilde{L}( \lambda x) \sim \gamma _{m,2,j}\log x \sim \widetilde{L}(x)\). □
Remark 2.1
With similar derivations, it can be obtained that the second moment of \(R_{nij}\) is finite for all \(3\leq i < j \leq m\). Therefore, a large number of classical limit properties of \(\{R_{nij}, n\geq 1\}\) for all \(3\leq i < j \leq m\) can be established easily. Hence, to study the limit behaviors for \(\{R_{nij}, n\geq 1\}\), we are only interested in \(\{R_{n1j}, n \geq 1\}\) and \(\{R_{n2j}, n \geq 1\}\).
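Theorem 2.1 can be illustrated numerically. Since the density of \(R_{n12}\) is \(1/r^{2}\) on \([1,\infty )\), part (I) predicts \(\mathbb{E} R_{n12}^{1/2} = \int _{1}^{\infty } r^{-3/2}\,dr = 2\), while \(\mathbb{E} R_{n12} = \infty \). The sketch below (an illustration of these facts, not a proof) shows the half-moment estimate settling near 2 while the sample maximum grows roughly like \(N\), the signature of the Pareto-type tail behind the infinite first moment.

```python
import random

random.seed(2)

N = 100_000
ratios = []
for _ in range(N):
    x, y = random.random(), random.random()
    ratios.append(max(x, y) / min(x, y))  # R_n12 for m = 2: density 1/r^2 on [1, oo)

# E R^(1/2) = 2 is finite (beta = 1/2 < 1), so this estimate stabilises ...
half_moment = sum(r ** 0.5 for r in ratios) / N

# ... while E R = infinity (beta = 1): the sample maximum keeps growing with N.
sample_max = max(ratios)

print(half_moment, sample_max)
```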
3 Limit properties of \(R_{n1j}\)
According to part (I) of Theorem 2.1, \(\mathbb{E} R_{n1j} = \infty \). It follows that the classical strong law of large numbers for \(\{R_{n1j}, n \geq 1\}\) fails. Hence, we give the following weighted strong law of large numbers.
Theorem 3.1
Let \(\{a_{n}, n \geq 1\}\) and \(\{b_{n}, n \geq 1\}\) be two positive sequences satisfying the following conditions:
- (i)
\(b_{n}\) is nondecreasing, \(b_{n}\to \infty \) as \(n \to \infty \), and \(\sum_{n=1}^{\infty } \frac{a_{n}}{b_{n}} < \infty \);
- (ii)
\(\frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} \log (\frac{b_{n}}{a_{n}} ) \to \lambda < \infty \) as \(N \to \infty \).
Then we have
$$ \lim_{N \to \infty } \frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} R_{n1j} = \gamma _{m,1,j} \lambda \quad \textit{a.s.}, $$
where \(\gamma _{m,1,j}\) is the same number as in (2.1).
Proof
Let \(c_{n}=b_{n}/a_{n}\). Then \(c_{n} \to \infty \) follows from condition (i). Without loss of generality, we assume that \(c_{n} \ge 1\) for all \(n\ge 1\). Notice the following partition:
$$ \frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} R_{n1j} = I_{N} + \mathit{II}_{N} + \mathit{III}_{N}, $$
where
$$ \begin{aligned} &I_{N} = \frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} \bigl[ R_{n1j} I( R_{n1j} \leq c_{n}) - \mathbb{E} R_{n1j} I( R_{n1j} \leq c_{n}) \bigr], \\ &\mathit{II}_{N} = \frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} R_{n1j} I( R_{n1j} > c_{n}), \qquad \mathit{III}_{N} = \frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} \mathbb{E} R_{n1j} I( R_{n1j} \leq c_{n}). \end{aligned} $$
Firstly, it can be established by condition (i) that
and for any \(0<\varepsilon < 1\),
Consequently, according to (3.2), the Khintchine–Kolmogorov convergence theorem (see Theorem 1 on page 113 of Chow and Teicher [2]), and Kronecker’s lemma, it is known that \(I_{N} \to 0\) almost surely. Furthermore, by condition (i) and Kronecker’s lemma, we have
Then it follows from (3.3) and the Borel–Cantelli lemma that \(\mathit{II}_{N} \to 0\) almost surely. Finally, since
it is obtained by condition (ii) that \(\mathit{III}_{N} \to \gamma _{m,1,j} \lambda \) almost surely.
The proof is then completed. □
Remark 3.1
In particular, letting \(j=2\) in Theorem 3.1, we have
In this case, the result does not depend on the sample size \(m_{n}\), because the density of \(R_{n12}\) does not depend on \(m_{n}\). Thus there is no difference between the assumption \(m_{n} \to \infty \) and the fixed-sample-size case \(m_{n}=m\). Furthermore, if we take \(b_{n}=(\log n )^{\alpha +2}\) and \(a_{n}= (\log n)^{\alpha }/n\), then for \(\alpha >-2\) the conditions of Theorem 3.1 hold with \(\lambda = 1/(\alpha +2)\), which shows that Theorem 3.1 in Adler [1] and Theorem 2.3 in Miao et al. [7] are special cases of Theorem 3.1.
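For a concrete feel for this special case, take \(\alpha =0\), i.e. \(a_{n}=1/n\), \(b_{n}=(\log n)^{2}\), so that \(\lambda =1/2\); since the density of \(R_{n12}\) is \(1/r^{2}\), the tail constant is \(\gamma _{m,1,2}=1\), and the weighted average \((\log N)^{-2}\sum_{n=1}^{N} R_{n12}/n\) should tend to \(1/2\) almost surely. The convergence is only logarithmic, so the Monte Carlo sketch below (illustrative only) shows the right order of magnitude rather than a tight limit.

```python
import math
import random

random.seed(3)

N = 100_000
weighted_sum = 0.0
for n in range(1, N + 1):
    x, y = random.random(), random.random()
    r = max(x, y) / min(x, y)      # R_n12, i.i.d. across n
    weighted_sum += r / n          # weights a_n = 1/n

# b_N = (log N)^2; the almost sure limit is gamma * lambda = 1/2,
# approached only at a log N rate, so a single run is still noisy.
t_stat = weighted_sum / math.log(N) ** 2
print(t_stat)
```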
Remark 3.2
Besides \(b_{n}=(\log n )^{\alpha +2}\), \(a_{n}= (\log n)^{\alpha }/n\) with \(\alpha >-2\), other sequences satisfy the conditions of Theorem 3.1, for example: (i) \(b_{n}=n^{\beta }\), \(a_{n}= 1\) for \(\beta >1\); (ii) \(b_{n}=n (\log n)^{\beta }\), \(a_{n}= 1\) for \(\beta >1\); and (iii) \(b_{n}=(\log n)^{2}(\log \log n)^{ \beta }\), \(a_{n}= (\log \log n)^{\beta }/n\) for all \(\beta \in \mathbb{R}\). As a result, Theorem 2.18 in Miao et al. [7] is also a special case of Theorem 3.1.
To establish the complete convergence of the ratio \(R_{n1j}\), we first introduce the following lemma of Sung et al. [10].
Lemma 3.1
Let \(\{X_{ni},1 \leq i \leq k_{n} ,n \geq 1\}\) be an array of row-wise independent random variables and \(\{\mu _{n}, n \geq 1\}\) be a sequence of positive constants with \(\sum_{n=1}^{\infty } \mu _{n} = \infty \). Suppose that, for every \(\varepsilon >0\) and some \(\delta >0\),
- (i)
\(\sum_{n=1}^{\infty } \mu _{n} \sum_{i=1}^{k_{n}} P(|X_{ni}| > \varepsilon ) < \infty \),
- (ii)
there exists \(J \geq 2 \) such that
$$ \sum_{n=1}^{\infty } \mu _{n} \Biggl(\sum_{i=1}^{k_{n}} \mathbb{E} X _{ni}^{2} I\bigl( \vert X_{ni} \vert \leq \delta \bigr) \Biggr)^{J} < \infty , $$
- (iii)
\(\sum_{i=1}^{k_{n}} \mathbb{E} X_{ni} I(|X_{ni}| \leq \delta ) \to 0\) as \(n \to \infty \).
Then it is true that
$$ \sum_{n=1}^{\infty } \mu _{n} P \Biggl( \Biggl\vert \sum_{i=1}^{k_{n}} X_{ni} \Biggr\vert > \varepsilon \Biggr) < \infty \quad \textit{for all } \varepsilon > 0. $$
Theorem 3.2
Let \(\{a_{n}, n \geq 1\}\), \(\{b_{n}, n \geq 1\}\), and \(\{\mu _{n}, n \geq 1\}\) be three positive sequences such that
- (i)
\(b_{n}\) is nondecreasing, \(b_{n}\to \infty \) as \(n \to \infty \), and \(\sum_{n=1}^{\infty } \frac{a_{n}}{b_{n}} < \infty \);
- (ii)
\(\frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} \log (\frac{b_{n}}{a_{n}} ) \to \lambda < \infty \) as \(N \to \infty \);
- (iii)
\(\sum_{n=1}^{\infty } \mu _{n} = \infty \) and \(\sum_{N=1}^{\infty } \mu _{N}\sum_{n=1}^{N} \frac{a_{n}}{b_{n}} < \infty \).
Then, for every \(\varepsilon >0\), we have
$$ \sum_{N=1}^{\infty } \mu _{N} P \Biggl( \Biggl\vert \frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} R_{n1j} - \gamma _{m,1,j} \lambda \Biggr\vert > \varepsilon \Biggr) < \infty , $$
where \(\gamma _{m,1,j}\) is the same number as in (2.1).
Proof
Let \(c_{n}=b_{n}/a_{n}\). Then \(c_{n} \to \infty \) according to condition (i). Without loss of generality, we assume that \(c_{n} \ge 1\) for all \(n\ge 1\). Notice the following partition:
where
According to condition (i), Markov’s inequality, and the following inequality
we have, for any \(\varepsilon >0\), that
Moreover, with condition (ii) and the following estimation
we obtain
It follows from (3.5) and condition (iii) that
Next, denote \(X_{Nn}=(a_{n}/b_{N} )R_{n1j} I(R_{n1j} > c_{n} )\). For any \(\varepsilon > 0\), we have
where \(\xi =\max \{ c_{n}, b_{N} \varepsilon /a_{n} \} \). It follows from condition (iii) that
Then, for any \(\delta >0 \), the following bound on the partial sums of truncated second moments holds:
which is bounded according to condition (i). This, combined with condition (iii), leads to the inequality that, for every \(J \geq 2\),
Noting that (1) \(\frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} \to 0\) as \(N \to \infty \) by condition (i) and Kronecker’s lemma, and (2) \(\log (b_{N} \delta /b_{n} )\) is bounded by \(\log C\) via taking \(\delta =C b_{1}/b_{N}\), we obtain, for any \(\delta >0\), that
Thus, from Lemma 3.1 and (3.7)–(3.9), we have
The combination of (3.4), (3.6), and (3.10) yields the desired result. □
Remark 3.3
As pointed out in Remark 3.2, the conditions of Theorem 3.2 are also easy to satisfy. For example, it can easily be verified that the following sequences all satisfy the conditions of Theorem 3.2: (1) \(b_{n}=n^{\beta }\), \(a_{n}= 1\), \(\mu _{n}=1/n\), \(\beta >1\); (2) \(b_{n}=n (\log n)^{\beta }\), \(a_{n}= 1\), \(\mu _{n}=1/[n(\log n)^{\delta }]\), \(\beta >1\), \(\delta >0\); (3) \(b_{n}=(\log n)^{2}(\log \log n)^{\beta }\), \(a_{n}= (\log \log n)^{\beta }/n\), \(\mu _{n}=1/[n(\log n)^{\delta }]\), \(\beta \in \mathbb{R}\), \(\delta >0\); and (4) \(b_{n}=(\log n )^{\alpha +2}\), \(a_{n}= (\log n)^{\alpha }/n\), \(\mu _{n}=1/[n(\log n)^{\delta }]\), \(\alpha >-2\), \(\delta >0\). When the sequences are taken as in (4), Theorem 3.2 becomes Theorems 2.1 and 2.6 in Xu and Miao [11], respectively.
In what follows, we give the weighted weak law of large numbers.
Theorem 3.3
Let \(\{a_{n}, n \geq 1\}\) and \(\{b_{n}, n \geq 1\}\) be two positive sequences satisfying the following conditions:
- (i)
\(\frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} \to 0\) as \(N \to \infty \);
- (ii)
\(\frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} \log (\frac{b_{N}}{a_{n}} ) \to \tilde{\lambda } < \infty \) as \(N \to \infty \).
Then
$$ \frac{1}{b_{N}}\sum_{n=1}^{N} a_{n} R_{n1j} \xrightarrow{P} \gamma _{m,1,j} \tilde{\lambda } \quad \textit{as } N \to \infty , $$
where \(\gamma _{m,1,j}\) is the same number as in (2.1).
Proof
We use the so-called weak law (see Theorem 1 on page 356 of Chow and Teicher [2]) for the proof of Theorem 3.3. From condition (i), we have the following two inequalities:
and, for any \(\varepsilon >0\),
which is sufficient to prove that the weighted sum converges to some real number.
Next, using condition (i) again, we have
It follows from condition (ii) that
The proof is then completed. □
In particular, let \(L(x)\) be a slowly varying function. If we take \(a_{n}= n^{\alpha } L(n)\) and \(b_{n}=n^{\alpha +1}L(n)\log n\), where \(\alpha >-1/(j-1)\) and \(j\geq 2\), then the conditions of Theorem 3.3 hold with \(\tilde{\lambda }=1/(\alpha +1)\). We state this result as the following corollary, whose proof shows how the conditions are verified.
Corollary 3.1
Let \(L(x)\) be a slowly varying function. Then, for any \(\alpha >-1/(j-1)\), we have
$$ \frac{1}{N^{\alpha +1}L(N)\log N}\sum_{n=1}^{N} n^{\alpha } L(n) R_{n1j} \xrightarrow{P} \frac{\gamma _{m,1,j}}{\alpha +1} \quad \textit{as } N \to \infty , $$
where \(\gamma _{m,1,j}\) is the same number as in (2.1).
Proof
Denote \(U(x)=x^{\gamma } L(x)\) and \(U_{p}(x)=\int _{0}^{x}t^{p} U(t) \,dt\). Since \(L(x)\) is slowly varying, \(U(x)=x^{\gamma } L(x)\) varies regularly with exponent γ. Therefore, according to part (b) of Theorem 1 on page 281 of Feller [3], for \(p \geq - \gamma -1\), it is true that
which implies that
It is easy to check that \(L(x)\log x\), \(L(x)\log L(x)\), and \(L(x)^{l}\) for all \(l\in \mathbb{N}\) are slowly varying functions, since \(L(x)\) is slowly varying. So we derive estimates similar to (3.12):
Let \(a_{n}= n^{\alpha } L(n)\), \(b_{n}=n^{\alpha +1}L(n)\log n\). We once again use the so-called weak law, which can be found on page 356 of Chow and Teicher [2]. By (3.12), it is easy to obtain that, for any \(\varepsilon >0\),
and
which are sufficient to prove that the weighted sum converges to some real number.
Next, notice that
In addition, by (3.12) and (3.13), we have
Therefore
where \(\alpha >-1/(j-1)\). The proof is then completed. □
Remark 3.4
It is easy to check that the corresponding strong law of large numbers for Theorem 3.3 fails. Hence the weak law of large numbers is optimal in this case. Corollary 3.1 extends Theorem 3.2 in Adler [1], which proves the same result for \(R_{n12}\) based on samples from the uniform distribution \(U(0, p)\).
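Corollary 3.1 with \(L\equiv 1\), \(\alpha =0\), \(j=2\) says that \((N\log N)^{-1}\sum_{n=1}^{N} R_{n12}\) converges in probability to \(\gamma _{m,1,2}\cdot 1 = 1\) (the density of \(R_{n12}\) being \(1/r^{2}\)). Because the convergence is only in probability, a single run is occasionally thrown far off by one huge ratio; a hedged way to see the statement numerically (an illustration, not a test of the theorem) is to look at the median over independent runs.

```python
import math
import random

random.seed(4)

def wlln_stat(N):
    """One run of (1 / (N log N)) * sum of R_n12, n = 1..N."""
    s = 0.0
    for _ in range(N):
        x, y = random.random(), random.random()
        s += max(x, y) / min(x, y)
    return s / (N * math.log(N))

# The weak law says each run is near 1 with high probability, but a single
# run can be spoiled by one enormous ratio, so we look at a median of runs.
runs = sorted(wlln_stat(50_000) for _ in range(15))
median_stat = runs[7]
print(median_stat)
```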
4 Limit properties of \(R_{n2j}\)
The following Marcinkiewicz–Zygmund type law of large numbers extends the result for \(R_{n23}\) in Xu and Miao [11, Theorem 2.3].
Theorem 4.1
For any \(\delta \in (0,2)\), we have
$$ \lim_{N \to \infty } \frac{1}{N^{1/\delta }}\sum_{n=1}^{N} ( R_{n2j}-c ) = 0 \quad \textit{a.s.} $$
for some finite constant c, where c takes the value \(\mathbb{E} R_{n2j}\) for \(\delta \in [1,2)\) and c is arbitrary for \(\delta \in (0,1)\). In particular, the Kolmogorov type strong law of large numbers holds when \(\delta =1\).
Proof
By part (II) of Theorem 2.1 and the Marcinkiewicz–Zygmund theorem (see Theorem 2 on page 125 in Chow and Teicher[2]), the theorem can be proved. □
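As a concrete check of the \(\delta =1\) case: for \(m=3\), \(i=2\), \(j=3\), one can compute from the joint density in Sect. 1 that \(f_{R_{n23}}(r)=2/r^{3}\) on \([1,\infty )\), so \(\mathbb{E} R_{n23}=2\) while \(\mathbb{E} R_{n23}^{2}=\infty \), consistent with Theorem 2.1(II). The Kolmogorov strong law then says the sample mean of \(R_{n23}\) converges almost surely to 2; a short Monte Carlo sketch (illustrative only):

```python
import random

random.seed(5)

N = 200_000
total = 0.0
for _ in range(N):
    xs = sorted(random.random() for _ in range(3))
    total += xs[2] / xs[1]   # R_n23 = X_(3) / X_(2); density 2/r^3 on [1, oo)

# E R_n23 = 2 (finite), E R_n23^2 = infinity: the strong law still applies,
# but the heavy tail makes the convergence visibly slower than usual.
sample_mean = total / N
print(sample_mean)
```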
The following central limit theorem and its almost sure version extend the corresponding results for \(R_{n23}\) established by Miao et al. [7, Theorem 2.12] and Xu and Miao [11, Theorem 2.4], respectively.
Theorem 4.2
Let \(L(x)=\mathbb{E} R^{2}_{n2j} I(|R_{n2j}| \leq x)\). Then
$$ \lim_{N \to \infty } P \Biggl( \frac{\sum_{n=1}^{N} (R_{n2j}-\mathbb{E} R_{n2j})}{\eta _{N}} \leq x \Biggr) = \varPhi (x) \quad \textit{for every } x \in \mathbb{R}, $$
where \(\eta _{N} = 1 \lor \sup \{x >0; N L(x) \geq x^{2}\}\) and \(\varPhi (x)\) denotes the standard normal distribution function.
Proof
From Theorem 4.17 of Kallenberg [6, page 73], it is known that \(L(x)=\mathbb{E} R^{2}_{n2j} I(|R_{n2j}| \leq x)\) is slowly varying as \(x\to \infty \). Therefore, the theorem is proved. □
Theorem 4.3
Let \(L(x)=\mathbb{E} R^{2}_{n2j} I(|R_{n2j}| \leq x)\). Then, for any real number x, we have
$$ \lim_{N \to \infty } \frac{1}{\log N}\sum_{n=1}^{N} \frac{1}{n} I \biggl( \frac{S_{n}}{\eta _{n}} \leq x \biggr) = \varPhi (x) \quad \textit{a.s.}, $$
where \(S_{n}=\sum_{i=1}^{n} Y_{i} \), \(Y_{n}= R_{n2j}-\mathbb{E} R_{n2j}\), \(\eta _{n} = 1 \lor \sup \{x >0; n L(x) \geq x^{2}\}\), and \(\varPhi (x)\) denotes the standard normal distribution function.
The proof of Theorem 4.3 is very similar to that of Xu and Miao [11, Theorem 2.4], so we omit it. For the same reason, the following large deviation principle, which extends the corresponding result established by Xu and Miao [11, Theorem 2.5], is given without proof.
Theorem 4.4
For any \(x>1/2\),
As noted above, the βth moment of \(R_{n2j}\) does not exist for any \(\beta \geq 2\). Thus some other classical limit properties, such as the law of the iterated logarithm, do not hold. As is well known, limit theorems for self-normalized sums usually require much less stringent moment conditions than the classical limit theorems. Therefore, we are interested in the limit behaviors of the self-normalized sum \(S_{N}/V_{N}\), where \(S_{N}=\sum_{n=1}^{N} (R _{n2j}-\mathbb{E} R_{n2j})\) and \(V_{N}^{2}=\sum_{n=1}^{N} (R_{n2j}- \mathbb{E} R_{n2j})^{2}\). Notice that \(\{R_{n2j}-\mathbb{E} R_{n2j}, n \geq 1 \}\) is a sequence of i.i.d. random variables with mean zero. In addition, the distribution of \(R_{n2j}-\mathbb{E} R_{n2j}\) is in the domain of attraction of the normal law, because \(\widetilde{L}(x)= \mathbb{E}(R_{n2j}-\mathbb{E} R_{n2j})^{2} I(|R_{n2j}-\mathbb{E} R _{n2j}|\leq x)\) is slowly varying as \(x\to \infty \). Thus many self-normalized limit properties can be established directly as corollaries of the corresponding well-known results. We list some of them without proofs.
From Theorem 3.3 in Giné et al. [4], we obtain the following self-normalized central limit theorem.
Theorem 4.5
\(S_{N}/V_{N}\xrightarrow{d} N(0,1)\) as \(N \to \infty \).
The following self-normalized almost sure central limit theorem can be obtained by Corollary 1 in Zhang [12].
Theorem 4.6
Denote \(d_{n}=\exp \{(\log n)^{\alpha }\}/n\) and \(D_{N}=\sum_{n=1}^{N} d_{n}\) for \(0\leq \alpha < \frac{1}{2}\). Then, for any real x, we have
$$ \lim_{N \to \infty } \frac{1}{D_{N}}\sum_{n=1}^{N} d_{n} I \biggl( \frac{S_{n}}{V_{n}} \leq x \biggr) = \varPhi (x) \quad \textit{a.s.} $$
From Theorem 3 in Robinson and Wang [8], we have the following self-normalized Berry–Esseen bounds.
Theorem 4.7
Let \(\eta _{N}=\sup \{x:Nx^{-2} \mathbb{E} R_{n2j}^{2} I(|R_{n2j}| \leq x) \geq 1\}\). Then there exists \(0<\eta <1\) such that
for all \(x\in \mathbb{R}\) and \(N\geq 1\), where
and A is an absolute constant.
From Theorem 3.1 in Shao [9], we obtain the following self-normalized moderate deviation principle.
Theorem 4.8
Let \(\{x_{N}, N\geq 1\}\) be a sequence of positive numbers satisfying \(x_{N} \to \infty \) and \(x_{N} = o(\sqrt{N})\) as \(N \to \infty \). Then
$$ \lim_{N \to \infty } \frac{1}{x_{N}^{2}} \log P \biggl( \frac{S_{N}}{V_{N}} \geq x_{N} \biggr) = -\frac{1}{2}. $$
The following self-normalized law of the iterated logarithm can be established by Theorem 3.1 in Griffin and Kuelbs [5].
Theorem 4.9
It is true that
$$ \limsup_{N \to \infty } \frac{S_{N}}{V_{N}\sqrt{2 \log \log N}} = 1 \quad \textit{a.s.} $$
References
Adler, A.: Laws of large numbers for ratios of uniform random variables. Open Math. 13(1), 571–576 (2015)
Chow, Y.S., Teicher, H.: Probability Theory: Independence, Interchangeability, Martingales, 3rd edn. Springer, New York (1997)
Feller, W.: An Introduction to Probability Theory and Its Applications, 3rd edn. Wiley, New York (1971)
Giné, E., Götze, F., Mason, D.M.: When is the Student t-statistic asymptotically standard normal? Ann. Probab. 25(3), 1514–1531 (1997)
Griffin, P.S., Kuelbs, J.D.: Self-normalized laws of the iterated logarithm. Ann. Probab. 17(4), 1571–1601 (1989)
Kallenberg, O.: Foundations of Modern Probability. Springer, New York (1997)
Miao, Y., Sun, Y., Wang, R., Dong, M.: Various limit theorems for ratios from the uniform distribution. Open Math. 14(1), 393–403 (2016)
Robinson, J., Wang, Q.: On the self-normalized Cramér-type large deviation. J. Theor. Probab. 18(4), 891–909 (2005)
Shao, Q.M.: Self-normalized large deviations. Ann. Probab. 25(1), 285–328 (1997)
Sung, S.H., Volodin, A.I., Hu, T.C.: More on complete convergence for arrays. Stat. Probab. Lett. 71(4), 303–311 (2005)
Xu, S.F., Miao, Y.: Some limit theorems for ratios of order statistics from uniform random variables. J. Inequal. Appl. 2017, 295 (2017)
Zhang, Y.: A general result on almost sure central limit theorem for self-normalized sums for mixing sequences. Lith. Math. J. 53(4), 471–483 (2013)
Acknowledgements
The authors greatly appreciate both the editor and the referees for their valuable comments and some helpful suggestions that improved the clarity and readability of this paper.
Funding
This work was supported by the National Natural Science Foundation of China (11871056, 11471104).
Author information
Contributions
All authors contributed equally and significantly in writing this article. All authors read and approved the final manuscript.
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Cite this article
Xu, S., Mei, C. & Miao, Y. Limit theorems for ratios of order statistics from uniform distributions. J Inequal Appl 2019, 303 (2019). https://doi.org/10.1186/s13660-019-2256-7