Throughout this paper, for two positive functions \(f(x)\) and \(g(x)\), we write
$$\begin{aligned}& f(x)\sim g(x) \quad \mbox{if }\lim_{x\rightarrow\infty} f(x)/g(x)=1; \\& f(x)\lesssim g(x)\quad \mbox{if }\limsup_{x\rightarrow\infty} f(x)/g(x)\leqslant1\quad \mbox{and} \\& f(x)\gtrsim g(x)\quad \mbox{if } \liminf_{x\rightarrow\infty} f(x)/g(x)\geqslant1. \end{aligned}$$
For two positive bivariate functions \(f(x,n)\) and \(g(x,n)\), we say that the asymptotic relation \(f(x,n)\lesssim g(x,n)\) holds uniformly for x in a nonempty set \(\Delta_{n}\) if
$$\limsup_{n\rightarrow\infty}\sup_{x\in\Delta_{n}} \frac{f(x,n)}{g(x,n)} \leqslant1. $$
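For instance (an illustrative example, not used in the sequel), take \(f(x,n)=x+n\), \(g(x,n)=x\), and \(\Delta_{n}=[n^{2},\infty)\); then
$$\sup_{x\in\Delta_{n}} \frac{f(x,n)}{g(x,n)}=\sup_{x\geqslant n^{2}} \biggl(1+\frac{n}{x} \biggr)=1+\frac{1}{n}\rightarrow1, $$
so \(f(x,n)\lesssim g(x,n)\) (indeed \(f\sim g\)) holds uniformly for \(x\in\Delta_{n}\), whereas the relation fails to hold uniformly for x in \((0,\infty)\).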
First, we show that the accumulated aggregate claim \(S_{n}\) in (1.1) has the same distribution as a randomly stopped random walk.
Lemma 2.1
Let \(\{Y, Y_{j},j\geqslant1\}\) be a sequence of i.i.d. non-negative random variables, independent of \(\{N_{i}, i\geqslant1\}\), such that Y and X in (1.1) are identically distributed. Then
$$S_{n} \stackrel{d}{=}\sum_{j=1}^{N_{1}+N_{2}+\cdots+N_{n}}Y_{j}, $$
where \(\stackrel{d}{=}\) denotes equality in distribution.
Proof
For any real \(r\leq0\) (the moment generating function of a non-negative random variable is always finite on \((-\infty,0]\)), the moment generating function of \(S_{n}\) can be computed as
$$\begin{aligned} M_{S_{n}}(r) =&E\bigl\{ \exp\{rS_{n}\}\bigr\} =E\Biggl\{ \exp\Biggl\{ r\sum_{i=1}^{n}\sum _{j=1}^{N_{i}}X_{i,j}\Biggr\} \Biggr\} \\ =&E\biggl\{ e^{r\sum _{j=1}^{N_{1}}X_{1,j}}e^{r\sum _{j=1}^{N_{2}}X_{2,j}}\cdots e^{r\sum _{j=1}^{N_{n}}X_{n,j}}\cdot\sum _{n_{1},n_{2},\ldots ,n_{n}}I_{(N_{1}=n_{1},N_{2}=n_{2},\ldots, N_{n}=n_{n})}\biggr\} \\ =&\sum_{n_{1},n_{2},\ldots ,n_{n}}E\bigl\{ e^{r\sum _{j=1}^{n_{1}}X_{1,j}}e^{r\sum _{j=1}^{n_{2}}X_{2,j}} \cdots e^{r\sum _{j=1}^{n_{n}}X_{n,j}}\cdot I_{(N_{1}=n_{1},N_{2}=n_{2},\ldots, N_{n}=n_{n})}\bigr\} \\ =&\sum_{n_{1},n_{2},\ldots ,n_{n}}\bigl(Ee^{rX} \bigr)^{\sum _{i=1}^{n}n_{i}}\cdot P\{ N_{1}=n_{1},N_{2}=n_{2}, \ldots, N_{n}=n_{n}\} \\ =&E\bigl\{ (M_{X})^{N_{1}+N_{2}+\cdots+ N_{n}}\bigr\} . \end{aligned}$$
where \(M_{X}=Ee^{rX}\), and the fourth equality uses the assumed independence between the claim sizes \(\{X_{i,j}\}\) and the claim numbers \(\{N_{i}\}\). On the other hand, we have
$$\begin{aligned}& E\Biggl\{ \exp\Biggl\{ r\sum_{j=1}^{N_{1}+N_{2}+\cdots+N_{n}}Y_{j} \Biggr\} \Biggr\} \\& \quad = E\Biggl\{ \exp\Biggl\{ r\sum_{j=1}^{N_{1}+N_{2}+\cdots+N_{n}}Y_{j} \Biggr\} \cdot\sum_{m}I_{(N_{1}+N_{2}+\cdots +N_{n}=m)}\Biggr\} \\& \quad = \sum_{m}E\Biggl\{ \exp\Biggl\{ r\sum _{j=1}^{m}Y_{j}\Biggr\} \cdot I_{(N_{1}+N_{2}+\cdots +N_{n}=m)}\Biggr\} \\& \quad =\sum_{m}\bigl(Ee^{rY} \bigr)^{m}\cdot P\{N_{1}+N_{2}+\cdots +N_{n}=m\} =E\bigl\{ (M_{Y})^{N_{1}+N_{2}+\cdots+ N_{n}}\bigr\} , \end{aligned}$$
where \(M_{Y}=Ee^{rY}\). Since X and Y are identically distributed, \(M_{X}=M_{Y}\), so the two moment generating functions coincide for all \(r\leq0\). By the uniqueness of the Laplace transform for non-negative random variables, \(S_{n}\) and \(\sum_{j=1}^{N_{1}+\cdots+N_{n}}Y_{j}\) have the same distribution. □
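To make the identity concrete, the following minimal Monte Carlo sketch compares the two sides of Lemma 2.1. It is purely illustrative: the Poisson autoregression used for the claim numbers (a stand-in for (1.2)), the parameters \(a_{0}=1\), \(a_{1}=0.5\), and the Exp(1) claim sizes are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_counts(n, a0=1.0, a1=0.5, n0=0):
    """Claim numbers N_1,...,N_n from a Poisson autoregression:
    N_t | F_{t-1} ~ Poisson(a0 + a1 * N_{t-1}) (illustrative stand-in for (1.2))."""
    counts = np.empty(n, dtype=np.int64)
    prev = n0
    for t in range(n):
        prev = rng.poisson(a0 + a1 * prev)
        counts[t] = prev
    return counts

def sample_lhs(n, reps):
    """S_n = sum_{i<=n} sum_{j<=N_i} X_{i,j}, with i.i.d. Exp(1) claim sizes."""
    out = np.empty(reps)
    for r in range(reps):
        counts = simulate_counts(n)
        out[r] = sum(rng.exponential(1.0, size=k).sum() for k in counts)
    return out

def sample_rhs(n, reps):
    """sum_{j <= N_1+...+N_n} Y_j, with i.i.d. Exp(1) Y_j, as in Lemma 2.1."""
    out = np.empty(reps)
    for r in range(reps):
        out[r] = rng.exponential(1.0, size=simulate_counts(n).sum()).sum()
    return out

a, b = sample_lhs(20, 5000), sample_rhs(20, 5000)
print(a.mean(), b.mean())                                      # sample means agree closely
print(np.quantile(a, [0.5, 0.9]), np.quantile(b, [0.5, 0.9]))  # so do the quantiles
```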
Next, the following lemma establishes the moderate deviation principle (MDP) for \(\{N_{t}, t\geq1\}\).
Lemma 2.2
Let \(\{N_{t}, t\geq1\}\) be defined by (1.2), with \(N_{0}\geq0\) a deterministic integer, and let \(\{b_{n}\}\) be a sequence of positive numbers satisfying \(b_{n}\rightarrow\infty\) and \(b_{n}/n\rightarrow0\). Then
$$ \limsup_{n\rightarrow\infty}\frac{1}{b_{n}}\log \mathbb{P} \Biggl\{ \frac{1}{\sqrt{nb_{n}}}\sum_{t=1}^{n} \biggl(N_{t}-{a_{0}\over 1-a_{1}} \biggr)\in H \Biggr\} \leqslant- \inf_{x\in H}I_{M}(x) $$
(2.1)
for each closed set
\(H\subset \mathbb{R}\); and
$$ \liminf_{n\rightarrow\infty}\frac{1}{b_{n}}\log \mathbb{P} \Biggl\{ \frac{1}{\sqrt{nb_{n}}}\sum_{t=1}^{n} \biggl(N_{t}-{a_{0}\over 1-a_{1}} \biggr)\in G \Biggr\} \geqslant- \inf_{x\in G}I_{M}(x) $$
(2.2)
for each open set \(G\subset\mathbb{R}\), where the rate function \(I_{M}(\cdot)\) is given by
$$I_{M}(x)={x^{2}\over 2\sigma^{2}} ,\quad x\in\mathbb{R}, \textit{where } \sigma^{2} =\mathbb{E}\bigl(\operatorname{Var}(N_{t}| \mathscr{F}_{t-1})\bigr)=\frac{a_{0}}{1-a_{1}}. $$
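The expression for \(\sigma^{2}\) can be verified directly from the conditional Poisson structure of (1.2), which is also used in the proof below: since the conditional law of \(N_{t}\) given \(\mathscr{F}_{t-1}\) is Poisson with mean \(a_{0}+a_{1}N_{t-1}\) (so that its conditional variance equals its conditional mean) and \(\mathbb{E}N_{t-1}=a_{0}/(1-a_{1})\),
$$\sigma^{2}=\mathbb{E}\bigl(\operatorname{Var}(N_{t}| \mathscr{F}_{t-1})\bigr) =a_{0}+a_{1}\mathbb{E}N_{t-1}=a_{0}+\frac{a_{1}a_{0}}{1-a_{1}}=\frac{a_{0}}{1-a_{1}}. $$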
Proof
By the Gärtner-Ellis theorem (Dembo and Zeitouni [13], Theorem 2.3.6, p. 44), it suffices to show that
$$ \lim_{n\to\infty}{1\over b_{n}}\log\mathbb{E} \exp \Biggl\{ \beta \sqrt{b_{n}\over n} \sum_{t=1}^{n}(N_{t}- \mathbb{E}N_{1}) \Biggr\} ={1\over 2}\sigma^{2} \beta^{2},\quad \beta\in\mathbb{R}. $$
(2.3)
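Indeed, once (2.3) is established, the limiting cumulant generating function \(\Lambda(\beta)={1\over 2}\sigma^{2}\beta^{2}\) is finite and differentiable on all of \(\mathbb{R}\), and the rate function of Lemma 2.2 is its Fenchel-Legendre transform:
$$I_{M}(x)=\sup_{\beta\in\mathbb{R}} \biggl\{ \beta x-{1\over 2}\sigma^{2}\beta^{2} \biggr\} ={x^{2}\over 2\sigma^{2}}. $$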
Let \(\beta\in\mathbb{R}\) be fixed but arbitrary, set \(\theta_{n}=\beta\sqrt{\frac{b_{n}}{n}}\), and write
$$l_{n}=a_{1}\bigl(e^{\theta_{n}}-1\bigr). $$
This choice of \(l_{n}\) is made precisely so that the conditional expectation computed below is free of \(N_{t-1}\). Observe that for any \(t\ge1\),
$$\begin{aligned}& \mathbb{E} \bigl[\exp \bigl\{ \theta_{n} (N_{t}-\mathbb {E}N_{1})-l_{n}N_{t-1} \bigr\} \vert{ \mathcal{F}}_{t-1} \bigr] \\& \quad =\exp \{-l_{n}N_{t-1}-\theta_{n} \mathbb{E}N_{1} \}\mathbb{E} \bigl[\exp\{\theta_{n} N_{t} \} \vert{\mathcal{F}}_{t-1} \bigr] \\& \quad =\exp \{-l_{n}N_{t-1}-\theta_{n}\mathbb{E} N_{1} \}\exp \bigl\{ (a_{0}+a_{1}N_{t-1}) \bigl(e^{\theta_{n}}-1\bigr) \bigr\} \\& \quad =\exp \biggl\{ \frac{a_{0}}{1-a_{1}}\bigl[(1-a_{1}) \bigl(e^{\theta_{n}}-1\bigr)-\theta_{n}\bigr] \biggr\} . \end{aligned}$$
where the second equality uses that, by (1.2), the conditional law of \(N_{t}\) given \({\mathcal{F}}_{t-1}\) is Poisson with mean \(a_{0}+a_{1}N_{t-1}\), and the third uses the definition of \(l_{n}\) together with \(\mathbb{E}N_{1}=a_{0}/(1-a_{1})\). Iterating this identity over \(t\) and taking expectations yields
$$\begin{aligned}& \mathbb{E}\exp \Biggl\{ \sum_{t=1}^{n+1} \bigl\{ \theta_{n} (N_{t}-\mathbb{E} N_{1})-l_{n}N_{t-1} \bigr\} \Biggr\} \\& \quad = \biggl(\exp \biggl\{ \frac{a_{0}}{1-a_{1}}\bigl[(1-a_{1}) \bigl(e^{\theta_{n}}-1\bigr)-\theta _{n}\bigr] \biggr\} \biggr)^{n+1}. \end{aligned}$$
(2.4)
On the other hand,
$$\begin{aligned}& \mathbb{E}\exp \Biggl\{ \sum_{t=1}^{n+1} \bigl\{ \theta_{n} (N_{t}-\mathbb{E} N_{1})-l_{n}N_{t-1} \bigr\} \Biggr\} \\& \quad =\exp \bigl\{ -(n+1)l_{n}\mathbb{E}N_{1} \bigr\} \mathbb{E} \Biggl\{ \exp \bigl\{ \theta_{n}N_{n+1}-l_{n}N_{0} \bigr\} \exp \Biggl\{ (\theta_{n}-l_{n} )\sum _{t=1}^{n} (N_{t}-\mathbb{E}N_{1}) \Biggr\} \Biggr\} \\& \quad =\exp \bigl\{ -(n+1) l_{n}\mathbb{E}N_{1} \bigr\} \mathbb{E}\exp \Biggl\{ \theta_{n}N_{n+1}+ ( \theta_{n}-l_{n} )\sum_{t=1}^{n} (N_{t}-\mathbb{E}N_{1}) \Biggr\} . \end{aligned}$$
(2.5)
Combining (2.4) and (2.5), and using the definition of \(l_{n}\) together with \(\mathbb{E}N_{1}=a_{0}/(1-a_{1})\), we obtain
$$\begin{aligned}& \biggl(\exp \biggl\{ \frac{a_{0}}{1-a_{1}}\bigl[(1-a_{1}) \bigl(e^{\theta_{n}}-1\bigr)-\theta _{n}\bigr] \biggr\} \biggr)^{n+1} \bigl(\exp \bigl\{ a_{1}\bigl(e^{\theta_{n}}-1 \bigr)\mathbb{E} N_{1} \bigr\} \bigr)^{n+1} \\& \quad = \biggl(\exp \biggl\{ \frac{a_{0}}{1-a_{1}}\bigl[e^{\theta_{n}}-1- \theta_{n}\bigr] \biggr\} \biggr)^{n+1} \\& \quad = \mathbb{E}\exp \Biggl\{ \theta_{n}N_{n+1}+ ( \theta_{n}-l_{n} )\sum_{t=1}^{n} (N_{t}-\mathbb{E}N_{1}) \Biggr\} . \end{aligned}$$
By the Taylor expansion \(e^{\theta_{n}}=1+\theta_{n}+\frac{1}{2}\theta_{n}^{2}+o(\theta_{n}^{2})\) and the identity \(\theta_{n}^{2}=\beta^{2}b_{n}/n\), the exponent on the left-hand side equals
$$(n+1)\frac{a_{0}}{1-a_{1}}\bigl(e^{\theta_{n}}-1-\theta_{n}\bigr) =\frac{n+1}{n}\cdot{1\over 2}\sigma^{2}\beta^{2}b_{n}+o(b_{n}) ={1\over 2}\sigma^{2}\beta^{2}b_{n}+o(b_{n}). $$
Thus
$$\lim_{n\to\infty}{1\over b_{n}}\log\mathbb{E}\exp \Biggl\{ \theta_{n}N_{n+1}+ (\theta_{n}-l_{n} )\sum _{t=1}^{n} (N_{t}- \mathbb{E}N_{1}) \Biggr\} ={1\over 2}\sigma^{2} \beta^{2}. $$
Since \(\sup_{t\geq1}\mathbb{E}\exp\{\theta N_{t}\}<\infty\) for every \(\theta>0\) (see Li [2]) and \(\theta_{n}\to0\), a standard exponential-approximation argument based on the Hölder inequality allows us to remove the term \(\theta_{n}N_{n+1}\) from the above limit. Hence we have
$$ \lim_{n\to\infty}{1\over b_{n}}\log\mathbb{E} \exp \Biggl\{ (\theta_{n}-l_{n} )\sum _{t=1}^{n} (N_{t}-\mathbb{E}N_{1}) \Biggr\} ={1\over 2}\sigma^{2} \beta^{2}. $$
(2.6)
Next, by the Hölder inequality,
$$\begin{aligned}& \mathbb{E}\exp \Biggl\{ (\theta_{n}-l_{n} )\sum _{t=1}^{n} (N_{t}-\mathbb{E} N_{1}) \Biggr\} \\& \quad \le \Biggl(\mathbb{E}\exp \Biggl\{ \theta_{n}\sum _{t=1}^{n} (N_{t}-\mathbb{E} N_{1}) \Biggr\} \Biggr)^{\theta_{n}-l_{n}\over \theta_{n}}\le\mathbb{E}\exp \Biggl\{ \theta_{n}\sum_{t=1}^{n} (N_{t}-\mathbb{E}N_{1}) \Biggr\} , \end{aligned}$$
where the second step uses \(0<(\theta_{n}-l_{n})/\theta_{n}\leq1\) together with the bound
$$\mathbb{E}\exp \Biggl\{ \theta_{n}\sum_{t=1}^{n} (N_{t}-\mathbb {E}N_{1}) \Biggr\} \ge1, $$
which follows from Jensen's inequality, as spelled out below.
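In detail, assuming (as elsewhere in the proof) that \(\mathbb{E}N_{t}=\mathbb{E}N_{1}=a_{0}/(1-a_{1})\) for every t, for example under stationary initialization, Jensen's inequality applied to the convex function \(e^{x}\) gives
$$\mathbb{E}\exp \Biggl\{ \theta_{n}\sum_{t=1}^{n} (N_{t}-\mathbb{E}N_{1}) \Biggr\} \geq\exp \Biggl\{ \theta_{n}\sum_{t=1}^{n}\bigl(\mathbb{E}N_{t}-\mathbb{E}N_{1}\bigr) \Biggr\} =e^{0}=1. $$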
Since \(\theta_{n}=\beta\sqrt{b_{n}\over n}\), the chain of inequalities above together with (2.6) yields the lower bound
$$ \liminf_{n\to\infty}{1\over b_{n}}\log \mathbb{E}\exp \Biggl\{ \beta\sqrt{b_{n}\over n}\sum _{t=1}^{n} (N_{t}-\mathbb{E} N_{1}) \Biggr\} \ge{1\over 2}\sigma^{2} \beta^{2}. $$
(2.7)
On the other hand, given \(0<\delta<1\), we have \(\theta_{n}-l_{n} >(1-\delta)\theta_{n} =(1-\delta)\beta\sqrt{b_{n}\over n}\) for all sufficiently large n. By the Hölder inequality,
$$\begin{aligned}& \mathbb{E}\exp \Biggl\{ (1-\delta)\beta\sqrt{b_{n}\over n}\sum _{t=1}^{n} (N_{t}-\mathbb{E}N_{1}) \Biggr\} \\& \quad \le \Biggl(\mathbb{E}\exp \Biggl\{ (\theta_{n}-l_{n} ) \sum_{t=1}^{n} (N_{t}-\mathbb{E} N_{1}) \Biggr\} \Biggr)^{(1-\delta)\theta_{n}\over \theta_{n}-l_{n}} \le \mathbb{E}\exp \Biggl\{ ( \theta_{n}-l_{n} )\sum_{t=1}^{n} (N_{t}-\mathbb{E} N_{1}) \Biggr\} . \end{aligned}$$
Hence, by (2.6),
$$\limsup_{n\to\infty}{1\over b_{n}}\log \mathbb{E}\exp \Biggl\{ (1-\delta)\beta\sqrt{b_{n}\over n}\sum_{t=1}^{n} (N_{t}-\mathbb{E} N_{1}) \Biggr\} \le{1\over 2} \sigma^{2} \beta^{2}. $$
Since \(\beta\in\mathbb{R}\) is arbitrary, replacing it by \((1-\delta)^{-1}\beta\) in the above leads to
$$\limsup_{n\to\infty}{1\over b_{n}}\log \mathbb{E}\exp \Biggl\{ \beta\sqrt{b_{n}\over n}\sum_{t=1}^{n} (N_{t}-\mathbb{E} N_{1}) \Biggr\} \le{1\over 2} \sigma^{2} \biggl({\beta\over 1-\delta} \biggr)^{2}. $$
Letting \(\delta\to0^{+}\) on the right-hand side yields the desired upper bound, which, together with the lower bound (2.7), leads to (2.3). □
As a consequence of Lemma 2.2, for every \(\eta>0\), taking the closed set \(H=\{x:|x|\geq\eta\}\) and \(b_{n}=\sqrt{n}\) (so that \(\sqrt{nb_{n}}=n^{3/4}\)), we have, for all sufficiently large n,
$$ \mathbb{P} \Biggl(\frac{1}{n}\Biggl\vert \sum _{t=1}^{n}(N_{t}-\mathbb {E}N_{1})\Biggr\vert \geq \eta \Biggr)\leq\mathbb{P} \Biggl( \frac{1}{n^{3/4}}\Biggl\vert \sum_{t=1}^{n}(N_{t}- \mathbb{E}N_{1})\Biggr\vert \geq\eta \Biggr) \le \exp \{-c_{\eta}\sqrt{n}\}, $$
(2.8)
where \(c_{\eta}>0\) is a constant independent of n (any \(c_{\eta}<\frac{\eta^{2}}{2\sigma^{2}}\) works). This gives genuine exponential decay of the probability that the sample mean of the claim numbers deviates from its expectation.
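The decay in (2.8) can be glimpsed numerically. The sketch below uses naive Monte Carlo, so only moderate values of n and η are feasible; the model parameters \(a_{0}=1\), \(a_{1}=0.5\) and the initial value \(N_{0}=0\) are again illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
a0, a1 = 1.0, 0.5
mu = a0 / (1 - a1)  # the stationary mean a0/(1-a1)

def deviation_prob(n, eta, reps=20000):
    """Estimate P(|sum_{t<=n}(N_t - mu)| / n^(3/4) >= eta) by simulating
    `reps` independent copies of the Poisson autoregression in parallel."""
    prev = np.zeros(reps, dtype=np.int64)
    s = np.zeros(reps)
    for _ in range(n):
        prev = rng.poisson(a0 + a1 * prev)  # one vectorized step of the chain
        s += prev - mu
    return np.mean(np.abs(s) >= eta * n ** 0.75)

for n in (50, 100, 200, 400):
    # the estimates should shrink with n, roughly like exp(-c*sqrt(n)) for some c > 0
    print(n, deviation_prob(n, eta=1.0))
```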
The last lemma below is a restatement of Theorem 3.1 of Ng et al. [7].
Lemma 2.3
Let \(\{Y, Y_{j}, j\geqslant1\}\) be a sequence of i.i.d. non-negative random variables with common distribution function \(F_{Y}\in\mathscr{C}\) and finite expectation μ, and let \(Q_{n}=\sum_{j=1}^{n}Y_{j}\). Then, for any fixed \(\gamma>0\),
$$ P(Q_{n}-n\mu>y)\sim n \overline{F}_{Y}(y)\quad (n \rightarrow\infty) \textit{ uniformly for } y\geq\gamma n. $$
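A standard example satisfying the hypotheses of Lemma 2.3 is the Pareto distribution. Since every distribution with a regularly varying tail belongs to the class \(\mathscr{C}\), one may take, for instance,
$$\overline{F}_{Y}(y)=(1+y)^{-\alpha},\quad y\geq0, \alpha>1, $$
which belongs to \(\mathscr{C}\) and has finite expectation \(\mu=1/(\alpha-1)\).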