Further Spitzer’s law for widely orthant dependent random variables
Journal of Inequalities and Applications volume 2021, Article number: 183 (2021)
Abstract
Spitzer's law is obtained for the maximum partial sums of widely orthant dependent random variables under improved moment conditions.
1 Introduction and main results
It is well known that the sample mean, based on a sequence of independent random variables with a common distribution, is a strongly consistent estimator of the population mean. This property is the Kolmogorov strong law of large numbers. The strong law of large numbers is one of the key results of modern probability theory.
For the independent case, the Kolmogorov strong law of large numbers is obtained by the Kolmogorov three-series theorem, which is based on the Kolmogorov inequality or the moment inequality for the maximum partial sums, see Petrov [11] for more detail. For some dependent cases, such as negatively associated random variables and \(\rho ^{*}\)-mixing random variables, the moment inequalities for the maximum partial sums are obtained by Matula [10] and Utev and Peligrad [14], respectively.
For the pairwise independent case, the moment inequality for the maximum partial sums is not known. Etemadi [6] used a subsequence method to prove the Kolmogorov strong law of large numbers. Matula [10] extended it to pairwise negatively quadrant dependent (NQD) random variables. Using Etemadi’s subsequence method, Kruglov [8] gave a sufficient condition for the Kolmogorov strong law of large numbers for nonnegative random variables as follows.
Theorem A
Let \(\{X_{n},n\geq 1\}\) be a sequence of nonnegative random variables, and let \(\{b_{n},n\geq 1\}\) be a bounded sequence of nonnegative numbers. Suppose that condition (1.1) holds.
Then the Kolmogorov strong law of large numbers (1.2) holds.
Unfortunately, (1.1) does not imply (1.2) without the nonnegativity assumption; see Remarks 1.1 and 1.2 in Bai et al. [1]. However, if (1.1) is strengthened to the maximal condition (1.3),
then (1.3) implies (1.2) without any additional assumptions. This fact can be proved by a subsequence method (see, for example, Chen et al. [2]), which is different from Etemadi's subsequence method. Spitzer [13] first established (1.1) for independent and identically distributed (i.i.d.) random variables \(\{X_{n}\}\) with \(b_{n}=EX_{1}\) by a combinatorial method. One can label (1.3) as Spitzer's law for maximum partial sums of random variables. In the independent case, (1.1) and (1.3) are equivalent. In some dependent cases, for example the pairwise independent case, (1.1) is strictly weaker than (1.3); see Bai et al. [1]. It is therefore more important to obtain (1.3) than (1.1).
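Judging from Spitzer [13] and the discussion above, in the identically distributed case with \(b_{n}=EX_{1}\) and \(S_{n}=\sum_{k=1}^{n}X_{k}\), condition (1.1) presumably reads
\[
\sum_{n=1}^{\infty }\frac{1}{n}P \bigl\{ \vert S_{n}-nEX_{1} \vert >n\varepsilon \bigr\} <\infty \quad \text{for all } \varepsilon >0,
\]
(1.2) is the almost sure convergence \(S_{n}/n\to EX_{1}\), and (1.3) is the maximal version
\[
\sum_{n=1}^{\infty }\frac{1}{n}P \Bigl\{ \max_{1\le m\le n} \vert S_{m}-mEX_{1} \vert >n\varepsilon \Bigr\} <\infty \quad \text{for all } \varepsilon >0.
\]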
Let us recall the concept of widely orthant dependent (WOD) random variables which was introduced by Wang et al. [15] as follows.
Definition 1.1
A sequence of random variables \(\{X_{n}, n\geq 1\}\) is said to be widely upper orthant dependent (WUOD) if, for each \(n\ge 1\), there exists a positive number \(g_{U}(n)\) such that, for all real numbers \(x_{k}\), \(1\le k\le n\),
\[
P\{X_{1}>x_{1}, X_{2}>x_{2}, \ldots , X_{n}>x_{n}\}\le g_{U}(n)\prod_{k=1}^{n}P\{X_{k}>x_{k}\}.
\]
It is said to be widely lower orthant dependent (WLOD) if, for each \(n\ge 1\), there exists a positive number \(g_{L}(n)\) such that, for all real numbers \(x_{k}\), \(1\le k\le n\),
\[
P\{X_{1}\le x_{1}, X_{2}\le x_{2}, \ldots , X_{n}\le x_{n}\}\le g_{L}(n)\prod_{k=1}^{n}P\{X_{k}\le x_{k}\},
\]
and it is said to be WOD if it is both WUOD and WLOD.
Wang et al. [15] provided many examples of WLOD and WUOD random variables. They also provided examples of WOD random variables that are neither positively dependent nor negatively dependent.
In Definition 1.1, \(g_{U}(n)\), \(g_{L}(n)\), \(n\ge 1\), are called dominating coefficients. The sequence \(\{X_{n}, n\ge 1\}\) is said to be extended negatively dependent (END) if \(g_{U}(n)=g_{L}(n)=M\) for all \(n\ge 1\) and for some positive constant M. In particular, \(\{X_{n}, n\ge 1\}\) is said to be negatively orthant dependent (NOD) or negatively dependent if \(M=1\). When \(n=2\) and \(g_{U}(2)=g_{L}(2)=1\), \(X_{1}\) and \(X_{2}\) are said to be NQD, and \(X_{1}\) and \(X_{2}\) are independent if the inequalities are replaced with the equalities.
Since the class of WOD random variables contains END random variables and NOD random variables as special cases, it is interesting to study the limiting behavior of WOD random variables. In fact, a number of authors studied the limiting behavior of WOD random variables. We refer to Wang and Cheng [16] for renewal theory, to Wang et al. [17] for risk theory, to Chen et al. [5] and Lang et al. [9] for complete convergence, and to Guan et al. [7] for complete moment convergence.
Recently, Chen and Sung [3] obtained Spitzer’s law for maximum partial sums of WOD random variables as follows.
Theorem B
Let \(\{ X_{n}, n\ge 1\}\) be a sequence of WOD random variables with the same distribution as X, and with dominating coefficients \(g_{L}(n)\), \(g_{U}(n)\) for \(n\ge 1\). Suppose that there exists a nondecreasing positive function \(g(x)\) on \([0, \infty )\) such that \(\max \{g_{L}(n), g_{U}(n)\}\le g(n)\) for \(n\ge 1\). Assume that one of the following conditions holds.
- (I) \(g(x)/x^{\tau }\downarrow \) for some \(0<\tau <1\), and \(E \vert X \vert g( \vert X \vert )<\infty \).
- (II) There exists a nondecreasing positive function \(h(x)\) on \([0, \infty )\) such that \(h(x)/x \downarrow \), \(\sum_{n=1}^{\infty }g(n)/(nh^{\gamma }(n))<\infty \) for some \(\gamma >0\), and \(E \vert X \vert h( \vert X \vert )<\infty \).
Then (1.4) holds.
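Here, judging from Chen and Sung [3], (1.4) is presumably the Spitzer-type statement
\[
\sum_{n=1}^{\infty }\frac{1}{n}P \Bigl\{ \max_{1\le m\le n} \Bigl\vert \sum_{k=1}^{m}(X_{k}-EX) \Bigr\vert >n\varepsilon \Bigr\} <\infty \quad \text{for all } \varepsilon >0.
\]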
Note that Chen and Sung [4] generalized Theorem B(I) to weighted sums \(\sum_{k=1}^{n} a_{nk}X_{k}\), where the weights \(\{a_{nk}\}\) satisfy \(\sum_{k=1}^{n} \vert a_{nk} \vert ^{\alpha }=O(n)\) for some \(\alpha \ge 2\).
When \(g(x)=\max \{1, x^{\alpha }\}\) (or \(\log ^{\alpha }x\)) for some \(\alpha >0\), the moment condition in (II) is \(E \vert X \vert ^{1+\delta }<\infty \) (or \(E \vert X \vert \log ^{\delta } \vert X \vert <\infty \), respectively) for some \(\delta >0\), which is weaker than that in (I); here, and in what follows, \(\log x=\log _{e} \max \{x, e\}\). When \(g(x)=(\log \log x)^{\alpha }\) for some \(\alpha >0\), the moment condition in (I) is \(E \vert X \vert (\log \log \vert X \vert )^{\alpha }<\infty \). In this paper, we improve the moment condition when \(g(x)=\max \{1, x^{\alpha }\}\), \(\log ^{\alpha }x\), or \((\log \log x)^{\alpha }\) for some \(\alpha >0\). In fact, we provide a weaker moment condition for a fairly general class of functions g containing the above functions.
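To see the claim about condition (II) concretely, take \(g(x)=\max \{1, x^{\alpha }\}\) and, for any fixed \(0<\delta \le 1\), choose \(h(x)=\max \{1, x^{\delta }\}\) (this particular choice of h is ours, for illustration only). Then \(h(x)/x\downarrow \), and
\[
\sum_{n=1}^{\infty }\frac{g(n)}{n h^{\gamma }(n)}=\sum_{n=1}^{\infty }n^{\alpha -1-\gamma \delta }<\infty \quad \text{whenever } \gamma >\alpha /\delta ,
\]
while \(E \vert X \vert h( \vert X \vert )<\infty \) reduces to \(E \vert X \vert ^{1+\delta }<\infty \).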
Now we state the main results. Some preliminary lemmas and the proofs of the main results will be detailed in Sect. 2. Some examples satisfying the main results will be presented in Sect. 3.
Theorem 1.1
Let \(\{X_{n}, n\ge 1\}\) be a sequence of WOD random variables with the same distribution as X and with dominating coefficients \(g_{L}(n)\), \(g_{U}(n)\) for \(n\ge 1\). Suppose that there exist nondecreasing positive functions \(g(x)\) and \(h(x)\) on \([0, \infty )\) such that \(\max \{g_{L}(n), g_{U}(n)\}\le g(n)\) for \(n\ge 1\), \(g(x) =O(x^{\alpha })\) for some \(\alpha >0\), \(x/h(x) \uparrow \infty \), and conditions (1.5) and (1.6) hold.
If \(E \vert X \vert h( \vert X \vert )<\infty \), then (1.4) holds.
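Judging from Remark 1.2 and Lemma 2.2 below, condition (1.5) is presumably the ratio condition
\[
\limsup_{x\to \infty }\frac{h(x)}{h(x/h(x))}<\infty .
\]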
Theorem 1.2
Let \(\{X_{n}, n\ge 1\}\) be a sequence of WOD random variables with the same distribution as X and with dominating coefficients \(g_{L}(n)\), \(g_{U}(n)\) for \(n\ge 1\). Suppose that there exists a nondecreasing positive function \(g(x)\) on \([0, \infty )\) such that \(\max \{g_{L}(n), g_{U}(n)\}\le g(n)\) for \(n\ge 1\), \(g(x)/x^{\tau }\downarrow \) for some \(0<\tau <1\), and condition (1.7) holds.
If \(E \vert X \vert \sqrt{g( \vert X \vert )}<\infty \), then (1.4) holds.
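Similarly, judging from the way (1.7) is used in the proof of Theorem 1.2 in Sect. 2, condition (1.7) appears to be the analogous ratio condition
\[
\limsup_{x\to \infty }\frac{g(x)}{g(x/g(x))}<\infty .
\]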
Remark 1.1
Chen and Sung [3, 4] proved Theorem 1.2 without condition (1.7). However, the moment condition of Chen and Sung [3, 4] is \(E \vert X \vert g( \vert X \vert )<\infty \), which is stronger than \(E \vert X \vert \sqrt{g( \vert X \vert )}<\infty \). Under the stronger moment condition \(E \vert X \vert g( \vert X \vert )<\infty \), Chen and Sung [4] generalized Theorem 1.2 to the weighted sums \(\sum_{k=1}^{n} a_{nk}X_{k}\) satisfying \(\sum_{k=1}^{n} \vert a_{nk} \vert ^{\alpha }=O(n)\) for some \(\alpha >0\), where \(0<\tau <\min \{1,\alpha \}\). When \(a_{nk}=1\) for \(1\le k\le n\) and \(n\ge 1\), we have \(\sum_{k=1}^{n} \vert a_{nk} \vert ^{\alpha }=O(n)\) for any \(\alpha >0\), and hence the condition \(0<\tau <\min \{1,\alpha \}\) reduces to \(0<\tau <1\).
Remark 1.2
It is important to find functions satisfying (1.5) or (1.7). A number of slowly varying functions, for example, \(\log x\), \(\log \log x\), and \(\log \log \log x\), satisfy (1.5). However, there exist slowly varying functions which do not satisfy (1.5); examples are \(\exp (\log x/\log \log x)\) and \(\exp (\log x/\log \log \log x)\). Further, there exists a non-slowly varying function satisfying (1.5). Define \(h(x)\) by
Then we have that
which implies that \(\limsup_{x\to \infty } h(x)/h(x/h(x))=2\). Hence \(h(x)\) satisfies (1.5). On the other hand, we also have that
which implies that \(h(x)\) is not a slowly varying function.
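For instance, under the ratio form of (1.5) stated above, \(h(x)=\log x\) satisfies (1.5) because
\[
\frac{h(x)}{h(x/h(x))}=\frac{\log x}{\log x-\log \log x}\to 1 \quad (x\to \infty ),
\]
whereas \(h(x)=\exp (\log x/\log \log x)\) fails it, since a direct computation gives
\[
\log \frac{h(x)}{h(x/h(x))}=\frac{\log x}{(\log \log x)^{2}}\bigl(1+o(1)\bigr)\to \infty .
\]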
Corollary 1.1
Let \(\{X_{n}, n\ge 1\}\) be a sequence of WOD random variables with the same distribution as X and with dominating coefficients \(g_{L}(n)\), \(g_{U}(n)\) for \(n\ge 1\). Suppose that there exists a nondecreasing positive function \(g(x)\) on \([0, \infty )\) such that \(\max \{g_{L}(n), g_{U}(n)\}\le g(n)\) for \(n\ge 1\). Assume that one of the following conditions holds.
- (I) \(g(x)=O(x^{\alpha })\) for some \(\alpha >0\), and \(E \vert X \vert \log \vert X \vert <\infty \).
- (II) \(g(x)=O((\log x)^{\alpha })\) for some \(\alpha >0\), and \(E \vert X \vert \log \log \vert X \vert <\infty \).
- (III) \(g(x)=O((\log \log x)^{\alpha })\) for some \(\alpha >0\), and \(E \vert X \vert (\log \log \vert X \vert )^{\alpha /2}<\infty \).
Then (1.4) holds.
Theorem 1.1 can be generalized as follows.
Theorem 1.3
Let \(r\ge 1\). Let \(\{X_{n}, n\ge 1\}\), \(g(x)\), and \(h(x)\) be as in Theorem 1.1 except (1.6), which is replaced by its analogue for general r. If \(E( \vert X \vert h( \vert X \vert ))^{r}<\infty \), then (1.8) holds.
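Judging from Remark 1.3 and from the fact that the case \(r=1\) recovers Theorem 1.1, conclusion (1.8) is presumably the Baum–Katz type statement
\[
\sum_{n=1}^{\infty }n^{r-2}P \Bigl\{ \max_{1\le m\le n} \Bigl\vert \sum_{k=1}^{m}(X_{k}-EX) \Bigr\vert >n\varepsilon \Bigr\} <\infty \quad \text{for all } \varepsilon >0,
\]
which reduces to (1.4) when \(r=1\).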
Theorem 1.2 can also be generalized as follows.
Theorem 1.4
Let \(1\le r<2\). Let \(\{X_{n}, n\ge 1\}\) and \(g(x)\) be as in Theorem 1.2 except \(0<\tau <1\). If \(0<\tau <2-r\) and \(E( \vert X \vert \sqrt{g( \vert X \vert )})^{r}<\infty \), then (1.8) holds.
Remark 1.3
When \(r=1\), Theorems 1.3 and 1.4 reduce to Theorems 1.1 and 1.2, respectively. We can also consider a more general result in which the normalizing sequence \(n\) is replaced by \(n^{\theta }\), where \(\theta >1/2\). If \(\theta =1\), this reduces to (1.8). However, the proofs of Theorems 1.3 and 1.4 cannot be applied to this case.
Throughout this paper, the symbol C denotes a positive constant which is not necessarily the same one in each appearance. For an event A, \(I(A)\) denotes the indicator function of the event A.
2 Lemmas and proofs
To prove the main results, we need the following lemmas. The following lemma is a Fuk–Nagaev type inequality for WOD random variables, which is due to Shen et al. [12].
Lemma 2.1
Let \(\{Y_{n},n\geq 1\}\) be a sequence of WOD random variables with dominating coefficients \(g_{L}(n)\), \(g_{U}(n)\) for \(n\ge 1\). If \(EY_{n}=0\) and \(EY^{2}_{n}<\infty \) for \(n\geq 1\), then, for any \(n\ge 1\), \(x>0\), and \(y>0\),
where \(M_{n}=\sum^{n}_{k=1}EY^{2}_{k}\) and \(g(n)=\max \{g_{L}(n), g_{U}(n)\}\).
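Fuk–Nagaev type bounds of this kind typically take the shape
\[
P \Bigl\{ \max_{1\le k\le n} \Bigl\vert \sum_{i=1}^{k}Y_{i} \Bigr\vert \ge x \Bigr\} \le C\sum_{k=1}^{n}P\{ \vert Y_{k} \vert >y\}+C g(n)\exp \biggl\{ \frac{x}{y}-\frac{x}{y}\ln \biggl(1+\frac{xy}{M_{n}} \biggr) \biggr\} ,
\]
that is, a truncated tail term plus \(g(n)\) times an exponential term in x, y, and \(M_{n}\); the exact form and constants used below are those of Shen et al. [12].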
The following lemma is simple, but it is useful for proving Theorem 1.1.
Lemma 2.2
Let \(f(x)\) be a positive function defined on \([0, \infty )\) such that \(f(x) \uparrow \) and \(x/f(x) \uparrow \infty \). If \(\limsup_{x\to \infty } f(x)/f(x/f(x))<\infty \), then \(\limsup_{x\to \infty } f(x)/f(\gamma x/f(x))<\infty \) for any \(\gamma >0\).
Proof
If f is bounded, then the result is obvious. We now assume that \(0< f(x) \uparrow \infty \). Let \(D=\sup_{x\ge 0} f(x)/f(x/f(x))\). Since \(0< f(x) \uparrow \infty \), there exists \(x_{0}\) such that \(f(x)\ge 1/\gamma \) if \(x\ge x_{0}\). It follows that, for \(x\ge x_{0}\), \(\gamma x\ge x/f(x)\) and hence \(f(\gamma x)\ge f(x/f(x))\ge f(x)/D\), and therefore, for \(x/f(x)\ge x_{0}\),
\[
f(x)\le D f\bigl(x/f(x)\bigr)\le D^{2} f\bigl(\gamma x/f(x)\bigr),
\]
which proves the result. □
Proof of Theorem 1.1
Without loss of generality, we can assume that \(X\geq 0\) a.s. Let \(\varepsilon >0\) be given. Since \(EX<\infty \) and \(EXh(X)<\infty \), there exists \(A>0\) such that \(\max \{EXI(X>A), EXh(X)I(X>A)\}<\min \{\varepsilon /8, \varepsilon ^{2}/(64 \delta )\} \). Let
Then \(X_{k}=Y_{k}+Z_{k}\). Hence, to prove (1.4), it suffices to prove that
and
By the same argument as \(I_{1}<\infty \) in Chen and Sung [3], (2.2) holds. So we only need to prove (2.1). Noting that \(EY_{k}\leq EXI(X>A)<\varepsilon /8 \), we have that
Therefore, to prove (2.1), it suffices to prove that
For \(n>A\) and \(1\le k\le n\), let
Then \(EW_{nk}\leq EXI(X>A)<\varepsilon /8\) and \(W_{nk}=Y_{k}\) if \(X_{k}\le n\). It follows that
So, to prove (2.3), it is enough to show that
since \(\sum^{\infty }_{n=1}n^{-1}P\{\bigcup^{n}_{k=1}\{X_{k}>n\}\}\leq \sum^{\infty }_{n=1}P\{X>n\}\leq EX<\infty \).
Since \(h(x)\uparrow \) and \(x/h(x)\uparrow \), we have that, for \(n>A\),
Note that \(\{W_{nk}, 1\le k\le n\}\) is a sequence of WOD random variables, since each \(W_{nk}\) is a monotone transformation of \(X_{k}\) (see Proposition 1.1 in Wang et al. [15]). Then we have by Lemma 2.1 with \(x=\varepsilon n/4\) and \(y=\varepsilon n /(16 \delta h(n))\) that
Since \(n/h(n)\uparrow \infty \), we have that \(\varepsilon n/(16\delta h(n))-\varepsilon /8> \varepsilon n/(17 \delta h(n))\) for all large n. We also have by (1.5) and Lemma 2.2 that \(\sup_{n\ge 1} h(n)/h(\varepsilon n/(17 \delta h(n)))\le D\) for some constant \(D>0\). It follows that
Therefore, (2.4) holds by (2.5), (2.6), and (1.6). The proof is completed. □
Proof of Theorem 1.2
Without loss of generality, we can assume that \(X\geq 0\) a.s. Let \(\varepsilon >0\) be given. Since \(EX<\infty \), there exists \(A>0\) such that \(EXI(X>A)<\varepsilon /8 \). Let
Then \(X_{k}=Y_{k}+Z_{k}\). As in the proof of Theorem 1.1, it suffices to prove that
For \(n/\sqrt{g(n)}>A\) and \(1\le k\le n\), let
Then \(EW_{nk}\leq EXI(X>A)<\varepsilon /8\) and \(W_{nk}=Y_{k}\) if \(X_{k}\le n/\sqrt{g(n)}\). It follows that
By (1.7) and \(0< g(x)\uparrow \), we have that \(\sup_{n\ge 1} g(n)/g(n/\sqrt{g(n)})\le \sup_{n\ge 1} g(n)/g(n/g(n)) \le D\) for some constant \(D>0\) (note that \(g(n)\ge 1\), since \(\max \{g_{L}(n), g_{U}(n)\}\le g(n)\)). Then
It remains to prove that
where \(\sum_{n=1, n/\sqrt{g(n)}>A}^{\infty }\) means that the summation is taken over all positive integers n satisfying \(n/\sqrt{g(n)}>A\). The rest of the proof is similar to that of Chen and Sung [3]. We have by Markov’s inequality and Lemma 2.6 in Chen and Sung [3] that
Interchanging the order of summation, we have that
We also have by the proof of (2.7) that
Hence (2.8) holds by (2.9)–(2.11). The proof is completed. □
Proof of Corollary 1.1
If (I) holds, then all conditions of Theorem 1.1 are satisfied provided that \(h(x):=\log x\). If (II) holds, then all conditions of Theorem 1.1 are satisfied provided that \(h(x):=\log \log x\). If (III) holds, then all conditions of Theorem 1.2 are satisfied provided that \(g(x):=C(\log \log x)^{\alpha }\). Hence the result follows from Theorems 1.1 and 1.2. □
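For case (III), the relevant conditions of Theorem 1.2 can be checked directly with \(g(x):=C(\log \log x)^{\alpha }\): for every \(0<\tau <1\), \(g(x)/x^{\tau }\) is nonincreasing for all large x; under the ratio form of (1.7),
\[
\frac{g(x)}{g(x/g(x))}=\biggl(\frac{\log \log x}{\log (\log x-\alpha \log \log \log x-\log C)}\biggr)^{\alpha }\to 1 \quad (x\to \infty );
\]
and \(E \vert X \vert \sqrt{g( \vert X \vert )}<\infty \) is exactly the moment condition \(E \vert X \vert (\log \log \vert X \vert )^{\alpha /2}<\infty \) in (III).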
Proof of Theorem 1.3
The proof is the same as that of Theorem 1.1 and is omitted. □
Proof of Theorem 1.4
The proof is the same as that of Theorem 1.2 and is omitted. □
3 Examples
In this section, we provide two examples satisfying Theorem 1.1 or Theorem 1.2.
Assume that \(\{ (Y_{n}, Z_{n}), n\ge 1\}\) is a sequence of i.i.d. random vectors such that, for each \(n\ge 1\), \(Y_{n}\) and \(Z_{n}\) have the same distribution as X, and they are dependent according to the Farlie–Gumbel–Morgenstern copula
\[
C_{\theta _{n}}(u,v)=uv\bigl(1+\theta _{n}(1-u)(1-v)\bigr),
\]
where \(-1\le \theta _{n} \le 1\).
Set \(X_{n}= Y_{(n+1)/2}\) if n is odd, and \(X_{n}=Z_{n/2}\) if n is even. Then \(\{X_{n}, n \ge 1\}\) is a sequence of WOD random variables with
where
(see Examples in Wang et al. [15]).
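A plausible source for these dominating coefficients (when \(0\le \theta _{m}\le 1\), as in the examples below): the pair \((Y_{m}, Z_{m})\) occupies positions \(2m-1\) and \(2m\), distinct pairs are independent, and the FGM form above yields \(P\{Y_{m}>y, Z_{m}>z\}\le (1+\theta _{m})P\{Y_{m}>y\}P\{Z_{m}>z\}\), which suggests coefficients of the product form
\[
g_{U}(n)=g_{L}(n)=\prod_{m\le n/2}(1+\theta _{m}).
\]
This product form is consistent with the values used in Examples 3.1 and 3.2 below.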
The following example satisfies the conditions of Theorem 1.1 but does not satisfy the conditions of Theorem 1.2.
Example 3.1
Let \(E \vert X \vert \log \vert X \vert <\infty \), and \(\theta _{n}=1\) if \(n=2^{k} \), \(k\ge 0\), and \(\theta _{n}=0\) otherwise. Then \(g_{L}(n)=g_{U}(n)=2^{k}\) for \(2^{k}\le n<2^{k+1}\), \(k\ge 0\). If we take \(g(x)=\max \{1, x\}\) and \(h(x)=\log x\), then \(g(x)\) and \(h(x)\) satisfy all the conditions of Theorem 1.1. In order to apply Theorem 1.2, we must take \(g(x)=\log ^{2} x\) by the moment condition. However, \(g(x)\) does not satisfy the condition \(\max \{g_{L}(n), g_{U}(n)\}\le g(n)\) of Theorem 1.2.
The following example satisfies the conditions of Theorem 1.2 but does not satisfy the conditions of Theorem 1.1.
Example 3.2
Let \(E \vert X \vert \sqrt{\log _{2} \log _{2} \vert X \vert }<\infty \), and \(\theta _{n}=1\) if \(n=2^{-1+2^{2^{k}}} \), \(k\ge 1\), and \(\theta _{n}=0\) otherwise, where \(\log _{2} x=\max \{1, \log _{e} x/\log _{e} 2\}\). Then we have that
If we take \(g(x)=\log _{2} \log _{2} x\), then \(g(x)\) satisfies all the conditions of Theorem 1.2. In order to apply Theorem 1.1, we must take \(h(x)= \sqrt{\log _{2} \log _{2} x}\) by the moment condition. However, \(g(x)\) and \(h(x)\) do not satisfy (1.6) of Theorem 1.1.
Availability of data and materials
Not applicable.
References
Bai, P., Chen, P., Sung, S.H.: On complete convergence and the strong law of large numbers for pairwise independent random variables. Acta Math. Hung. 142, 502–518 (2014)
Chen, P., Hu, T.-C., Volodin, A.: A note on the rate of complete convergence for maximum of partial sums for moving average processes in Rademacher type Banach spaces. Lobachevskii J. Math. 21, 45–55 (2006)
Chen, P., Sung, S.H.: A Spitzer-type law of large numbers for widely orthant dependent random variables. Stat. Probab. Lett. 154, 108544 (2019)
Chen, P., Sung, S.H.: Complete convergence for weighted sums of widely orthant-dependent random variables. J. Inequal. Appl. 2021, 45 (2021)
Chen, W., Wang, Y., Cheng, D.: An inequality of widely dependent random variables and its applications. Lith. Math. J. 56, 16–31 (2016)
Etemadi, N.: An elementary proof of the strong law of large numbers. Z. Wahrscheinlichkeitstheor. Verw. Geb. 55, 119–122 (1981)
Guan, L., Xiao, Y., Zhao, Y.: Complete moment convergence of moving average processes for m-WOD sequence. J. Inequal. Appl. 2021, 16 (2021)
Kruglov, V.M.: Strong law of large numbers. In: Zolotarev, V.M., Kruglov, V.M., Korolev, V.Yu. (eds.) Stability Problems for Stochastic Models, pp. 139–150. TVP/VSP, Moscow/Utrecht (1994)
Lang, J., He, T., Yu, Z., Wu, Y., Wang, X.: Complete convergence for randomly weighted sums of random variables and its application in linear-time-invariant systems. Commun. Stat., Simul. Comput. (2021). https://doi.org/10.1080/03610918.2020.1870695
Matula, P.: A note on the almost sure convergence of sums of negatively dependent variables. Stat. Probab. Lett. 15, 209–213 (1992)
Petrov, V.V.: Sums of Independent Random Variables. Springer, Berlin (1975)
Shen, A., Yao, M., Wang, W., Volodin, A.: Exponential probability inequalities for WNOD random variables and their applications. Rev. R. Acad. Cienc. Exactas Fís. Nat., Ser. A Mat. 110, 251–268 (2016)
Spitzer, F.: A combinatorial lemma and its application to probability theory. Trans. Am. Math. Soc. 82, 323–339 (1956)
Utev, S., Peligrad, M.: Maximal inequalities and an invariance principle for a class of weakly dependent random variables. J. Theor. Probab. 16, 101–115 (2003)
Wang, K., Wang, Y., Gao, Q.: Uniform asymptotics for the finite-time ruin probability of a dependent risk model with a constant interest rate. Methodol. Comput. Appl. Probab. 15, 109–124 (2013)
Wang, Y., Cheng, D.: Basic renewal theorems for random walks with widely dependent increments. J. Math. Anal. Appl. 384, 597–606 (2011)
Wang, Y., Cui, Z., Wang, K., Ma, X.: Uniform asymptotics of the finite-time ruin probability for all times. J. Math. Anal. Appl. 390, 208–223 (2012)
Acknowledgements
The authors would like to thank the referees for careful reading of the manuscript and valuable suggestions which helped in improving an earlier version of this paper.
Funding
The research of Soo Hak Sung is supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2020R1F1A1A01050160).
Contributions
All authors read and approved the manuscript.
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Chen, P., Luo, J. & Sung, S.H. Further Spitzer’s law for widely orthant dependent random variables. J Inequal Appl 2021, 183 (2021). https://doi.org/10.1186/s13660-021-02718-4