A tree is a graph G=(T,E) which is connected and contains no circuits. Given any two vertices \alpha \ne \beta \in T, let \overline{\alpha \beta} be the unique path connecting *α* and *β*. Define the graph distance d(\alpha ,\beta ) as the number of edges on the path \overline{\alpha \beta}.

Let *T* be an infinite tree which is locally finite and has no leaves. We choose a vertex as the root. The number of neighbors of a vertex is called the degree of that vertex. When the degrees of all vertices of *T* are uniformly bounded, we say that *T* is a uniformly bounded tree. A tree is called a Bethe tree, denoted by {T}_{B,N}, if each vertex has N+1 neighboring vertices. A tree is called a Cayley tree, denoted by {T}_{C,N}, if the root has only *N* neighbors and every other vertex has N+1 neighbors. Both are common examples of homogeneous trees, and both are clearly special cases of uniformly bounded trees. When the context permits, uniformly bounded trees are all denoted simply by *T*.
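The generation sizes of the two homogeneous trees follow directly from these degree counts. As a minimal sketch (the helper names below are illustrative, not from the source), the size of the *n*th generation and of the subtree {T}^{(n)} can be computed as:

```python
# Sizes of the n-th generation L_n and of the subtree T^(n) for the
# Bethe tree T_{B,N} (every vertex has N+1 neighbors) and the Cayley
# tree T_{C,N} (root has N neighbors, all other vertices N+1).

def bethe_generation_size(N: int, n: int) -> int:
    """|L_n| for T_{B,N}: the root has N+1 children, every other vertex N."""
    return 1 if n == 0 else (N + 1) * N ** (n - 1)

def cayley_generation_size(N: int, n: int) -> int:
    """|L_n| for T_{C,N}: every vertex has exactly N children."""
    return N ** n

def subtree_size(gen_size, N: int, n: int) -> int:
    """|T^(n)|: total number of vertices from level 0 through level n."""
    return sum(gen_size(N, m) for m in range(n + 1))
```

For instance, with N=2 the Cayley tree has |{T}^{(2)}|=1+2+4=7 vertices, while the Bethe tree has |{T}^{(2)}|=1+3+6=10.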

Let *T* be an infinite tree with root *o*. The set of all vertices at distance *n* from the root is called the *n*th generation of *T*, which is denoted by {L}_{n}. We denote by {T}^{(n)} the subtree comprised of level 0 (the root *o*) through level *n*. For each vertex *t*, there is a unique path from *o* to *t*, and we write |t| for the number of edges on this path. We denote the first predecessor of *t* by {1}_{t}, the second predecessor by {2}_{t}, and, in general, the *n*th predecessor by {n}_{t}. For any two vertices *s* and *t* of the tree *T*, write s\le t if *s* is on the unique path from the root *o* to *t*. We denote by s\wedge t the vertex farthest from *o* satisfying s\wedge t\le s and s\wedge t\le t.
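These notions can be sketched concretely for a finite rooted tree stored as a parent map; the tree and the helper names below are illustrative, not from the source. Note that d(s,t)=|s|+|t|-2|s\wedge t|, which the last function uses.

```python
# A rooted tree stored as a parent map: parent[t] is the first
# predecessor 1_t of t; the root o has parent None.

parent = {'o': None, 'a': 'o', 'b': 'o', 'aa': 'a', 'ab': 'a'}

def depth(t):
    """|t|: the number of edges on the unique path from the root o to t."""
    d = 0
    while parent[t] is not None:
        t = parent[t]
        d += 1
    return d

def predecessor(t, n):
    """n_t: the n-th predecessor of t."""
    for _ in range(n):
        t = parent[t]
    return t

def meet(s, t):
    """s ∧ t: the vertex farthest from o with s∧t <= s and s∧t <= t."""
    ancestors = set()
    u = s
    while u is not None:
        ancestors.add(u)
        u = parent[u]
    while t not in ancestors:
        t = parent[t]
    return t

def distance(s, t):
    """Graph distance d(s, t) = |s| + |t| - 2|s ∧ t|."""
    return depth(s) + depth(t) - 2 * depth(meet(s, t))
```

For instance, `meet('aa', 'ab')` returns `'a'`, and `distance('aa', 'b')` returns 3.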

Let (\mathrm{\Omega},\mathcal{F}) be a measurable space and \{{X}_{t},t\in T\} a collection of random variables defined on (\mathrm{\Omega},\mathcal{F}) and taking values in S=\{0,1,\dots ,b-1\}, where *b* is a positive integer. Let *A* be a subgraph of *T*, write {X}^{A}=\{{X}_{t},t\in A\}, and denote by |A| the number of vertices of *A*. Let {x}^{{T}^{(n)}} be a realization of {X}^{{T}^{(n)}}. Let *μ* be a probability measure on the measurable space (\mathrm{\Omega},\mathcal{F}); we call *μ* a random field on the tree *T*. The distribution of \{{X}_{t},t\in T\} under the probability measure *μ* is given by \mu ({x}^{{T}^{(n)}})=\mu ({X}^{{T}^{(n)}}={x}^{{T}^{(n)}}); thus \mu ({x}^{{T}^{(n)}}) is the marginal distribution of *μ* on {T}^{(n)}.

**Definition 1** (see [1])

Let *T* be an infinite tree which is locally finite and has no leaves, *S* be a finite state space, \{{X}_{t},t\in T\} be a collection of *S*-valued random variables defined on the measurable space (\mathrm{\Omega},\mathcal{F}), and let

p=(p(x)),\phantom{\rule{1em}{0ex}}x\in S,

(1)

be a distribution on *S*,

P=(P(y|x)),\phantom{\rule{1em}{0ex}}x,y\in S,

(2)

be a stochastic matrix on {S}^{2}. Let {\mu}_{P} be a probability measure on the measurable space (\mathrm{\Omega},\mathcal{F}). If for any vertex t\in T,

{\mu}_{P}({X}_{t}=y|{X}_{{1}_{t}}=x\text{ and }{X}_{s}\text{ for }s\wedge t\le {1}_{t})={\mu}_{P}({X}_{t}=y|{X}_{{1}_{t}}=x)=P(y|x),\phantom{\rule{1em}{0ex}}x,y\in S,

(3)

and

{\mu}_{P}({X}_{o}=x)=p(x),\phantom{\rule{1em}{0ex}}x\in S,

then \{{X}_{t},t\in T\} will be called *S*-valued Markov chains indexed by the infinite tree *T* with initial distribution (1) and transition matrix (2) under probability measure {\mu}_{P}.

Let \{{X}_{t},t\in T\} be Markov chains indexed by tree *T* under probability measure {\mu}_{P} defined above. It is easy to see that

{\mu}_{P}\left({x}^{{T}^{(n)}}\right)={\mu}_{P}({X}^{{T}^{(n)}}={x}^{{T}^{(n)}})=p({x}_{o})\prod _{m=1}^{n}\prod _{t\in {L}_{m}}P({x}_{t}|{x}_{{1}_{t}}).

(4)

In order to avoid technical problems, we always assume that \mu ({x}^{{T}^{(n)}}), P(y|x) and p(x) are positive.
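On a tiny subtree, the product formula (4) can be checked directly to define a probability distribution; the tree, initial distribution, and transition matrix below are illustrative choices, not from the source.

```python
import itertools

# Verify on T^(1) of a small Cayley-type tree that the product formula
# mu_P(x^{T^(n)}) = p(x_o) * prod P(x_t | x_{1_t}) sums to 1 over all
# configurations.

parent = {'o': None, 'a': 'o', 'b': 'o'}
vertices = ['o', 'a', 'b']
S = [0, 1]
p = {0: 0.4, 1: 0.6}                 # illustrative initial distribution
P = {(0, 0): 0.7, (0, 1): 0.3,       # P[(x, y)] = P(y | x)
     (1, 0): 0.2, (1, 1): 0.8}

def mu_P(config):
    """mu_P(x^{T^(n)}) via the factorization (4)."""
    prob = p[config['o']]
    for t in vertices:
        if parent[t] is not None:
            prob *= P[(config[parent[t]], config[t])]
    return prob

total = sum(mu_P(dict(zip(vertices, values)))
            for values in itertools.product(S, repeat=len(vertices)))
```

Summing `mu_P` over all {b}^{|{T}^{(n)}|} configurations yields `total` equal to 1, as a product of stochastic rows must.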

**Definition 2** Let *T* be a uniformly bounded tree which has no leaves and \{{X}_{t},t\in T\} be a collection of *S*-valued random variables defined on (\mathrm{\Omega},\mathcal{F}), P=(P(y|x)), x,y\in S be a positive stochastic matrix, *μ*, {\mu}_{P} be two probability measures on (\mathrm{\Omega},\mathcal{F}), and \{{X}_{t},t\in T\} be Markov chains indexed by the tree *T* under probability measure {\mu}_{P} determined by *P*. Assume that \mu ({x}^{{T}^{(n)}}) is always strictly positive. Let

\phi (\omega )=\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}sup}\frac{1}{|{T}^{(n)}|}ln\frac{\mu ({X}^{{T}^{(n)}})}{{\mu}_{P}({X}^{{T}^{(n)}})}.

(6)

\phi (\omega ) will be called the asymptotic logarithmic likelihood ratio.

**Remark 1** If \mu ={\mu}_{P}, then \phi (\omega )\equiv 0. In Lemma 1 we will show that in the general case \phi (\omega )\ge 0 *μ*-a.e.; hence \phi (\omega ) can be regarded as a measure of the Markov approximation of an arbitrary random field on *T*.

Tree models have recently drawn increasing interest from specialists in physics, probability, and information theory. Benjamini and Peres [1] introduced the notion of tree-indexed homogeneous Markov chains and studied their recurrence and ray-recurrence. Berger and Ye [2] studied the existence of the entropy rate for some stationary random fields on a homogeneous tree. Ye and Berger [3] studied the asymptotic equipartition property (AEP) in the sense of convergence in probability for a PPG-invariant and ergodic random field on a homogeneous tree. Recently, Yang [4] studied some strong limit theorems for countable homogeneous Markov chains indexed by a homogeneous tree, as well as the strong law of large numbers and the AEP for finite homogeneous Markov chains indexed by a homogeneous tree. Huang and Yang [5] studied the strong law of large numbers for Markov chains indexed by an infinite tree with uniformly bounded degree. Liu and Wang [6] studied the small deviation theorems between arbitrary random fields and Markov chain fields on the Cayley tree. Peng, Yang, and Wang [7] further studied a class of small deviation theorems for functionals of random fields on a homogeneous tree, which partially extends the results of [6].

In this paper, by introducing the asymptotic logarithmic likelihood ratio as a measure of the Markov approximation of an arbitrary random field on a uniformly bounded tree, and by constructing a non-negative martingale, we obtain two results: a class of small deviation theorems for functionals, and a class of small deviation theorems for the frequencies of occurrence of states, for random fields on a uniformly bounded tree. In fact, our results contain those of [5] and [7] as special cases.

**Lemma 1** *Let* *T* *be a uniformly bounded tree which has no leaves*. *Let* {\mu}_{1}, {\mu}_{2} *be two probability measures on* (\mathrm{\Omega},\mathcal{F}), D\in \mathcal{F}, \{{\tau}_{n},n\ge 1\} *be a sequence of positive random variables such that*

\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}inf}\frac{{\tau}_{n}}{|{T}^{(n)}|}>0\phantom{\rule{1em}{0ex}}{\mu}_{1}\mathit{\text{-a.e. on }}D.

(7)

*Then*

\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}sup}\frac{1}{{\tau}_{n}}ln\frac{{\mu}_{2}({X}^{{T}^{(n)}})}{{\mu}_{1}({X}^{{T}^{(n)}})}\le 0\phantom{\rule{1em}{0ex}}{\mu}_{1}\mathit{\text{-a.e. on }}D.

(8)

*Proof* The proof of this lemma is similar to that of Lemma 1 of [6], so we omit it. □

**Remark 2** Letting {\mu}_{1}=\mu, {\mu}_{2}={\mu}_{P}, and {\tau}_{n}=|{T}^{(n)}| in Lemma 1, by (8) there exists A\in \mathcal{F} with \mu (A)=1 such that

\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}sup}\frac{1}{|{T}^{(n)}|}ln\frac{{\mu}_{P}({X}^{{T}^{(n)}})}{\mu ({X}^{{T}^{(n)}})}\le 0,\phantom{\rule{1em}{0ex}}\omega \in A,

(9)

hence we have \phi (\omega )\ge 0, \omega \in A.

From this remark, we know that a set D\in \mathcal{F} and a sequence \{{\tau}_{n},n\ge 1\} satisfying (7) always exist.
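Lemma 1 can be illustrated on the simplest possible tree, a single branch, where it reduces to a classical fact: under {\mu}_{1} the normalized log-likelihood ratio of a mismatched measure {\mu}_{2} tends to minus the Kullback-Leibler divergence, which is negative. The sketch below uses illustrative i.i.d. Bernoulli measures, not anything from the source.

```python
import math
import random

# Single-branch illustration of (8): sample X_1,...,X_n i.i.d. under
# mu_1 = Bernoulli(q1) and compare against a mismatched model
# mu_2 = Bernoulli(q2). The normalized log-likelihood ratio
# (1/n) ln(mu_2 / mu_1) approaches -KL(mu_1 || mu_2) < 0,
# so in particular its limsup is <= 0, mu_1-a.e.

random.seed(0)
q1, q2, n = 0.5, 0.3, 200_000

log_ratio = 0.0
for _ in range(n):
    x = 1 if random.random() < q1 else 0
    log_ratio += math.log((q2 if x else 1 - q2) / (q1 if x else 1 - q1))

# KL divergence between the two Bernoulli measures.
kl = q1 * math.log(q1 / q2) + (1 - q1) * math.log((1 - q1) / (1 - q2))
```

Here `log_ratio / n` is negative and close to `-kl`, matching the direction of the inequality in (8).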

Let *T* be a uniformly bounded tree and k,l\in S. Let {S}_{n}(k) be the number of vertices t\in {T}^{(n)} with {X}_{t}=k, and {S}_{n}(k,l) the number of non-root vertices t\in {T}^{(n)} with {X}_{{1}_{t}}=k and {X}_{t}=l; that is,

{S}_{n}(k)=\sum _{t\in {T}^{(n)}}{\delta}_{k}({X}_{t}),\phantom{\rule{2em}{0ex}}{S}_{n}(k,l)=\sum _{m=1}^{n}\sum _{t\in {L}_{m}}{\delta}_{k}({X}_{{1}_{t}}){\delta}_{l}({X}_{t}),

where {\delta}_{k}(\cdot ) (k\in S) is the Kronecker *δ*-function

{\delta}_{k}(x)=\{\begin{array}{cc}1,\hfill & \text{if }x=k,\hfill \\ 0,\hfill & \text{if }x\ne k.\hfill \end{array}
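As a concrete sketch of these occurrence counts (the tree and its state labels below are illustrative, not from the source):

```python
# S_n(k) counts the vertices of T^(n) whose state is k, and S_n(k, l)
# counts the non-root vertices in state l whose first predecessor is
# in state k.

parent = {'o': None, 'a': 'o', 'b': 'o', 'aa': 'a', 'ab': 'a'}
state = {'o': 0, 'a': 1, 'b': 0, 'aa': 1, 'ab': 0}

def S_single(k):
    """S_n(k) = sum over t in T^(n) of delta_k(X_t)."""
    return sum(1 for t in parent if state[t] == k)

def S_pair(k, l):
    """S_n(k, l) = sum over non-root t of delta_k(X_{1_t}) delta_l(X_t)."""
    return sum(1 for t in parent
               if parent[t] is not None
               and state[parent[t]] == k and state[t] == l)
```

Here `S_single(0)` returns 3 (vertices o, b, ab) and `S_pair(0, 1)` returns 1 (the edge from o to a).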

Let the degree of each vertex *σ* (\sigma \ne o) on the tree *T* be d(\sigma ). Since *T* is a uniformly bounded tree which has no leaves, we know that there are two positive numbers *m* and *M* such that 2\le m\le d(\sigma )\le M.

**Lemma 2** *Let* *T* *be a uniformly bounded tree which has no leaves*, P=(P(y|x)), x,y\in S *be a positive stochastic matrix*, *μ*, {\mu}_{P} *be two probability measures on* (\mathrm{\Omega},\mathcal{F}), \{{X}_{t},t\in T\} *be Markov chains indexed by* *T* *under probability measure* {\mu}_{P} *determined by* *P*, \phi (\omega ) *be defined by* (6), *M* *be defined as above*, 0\le c<ln{(1-{a}_{k})}^{-1} *be a constant*, *and*

D(c)=\{\omega :\phi (\omega )\le c\},

*where* {a}_{k}=min\{P(k|i),i\in S\}, {b}_{k}=max\{P(k|i),i\in S\}. *Then*

\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}inf}\frac{{S}_{n-1}(k)}{|{T}^{(n)}|}\ge \frac{{M}_{k}}{M-1}\phantom{\rule{1em}{0ex}}\mu \mathit{\text{-a.e. on }}D(c).

(14)

*Proof* By an argument similar to the proof of the corresponding lemma of [6], we can obtain

\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}inf}\frac{{S}_{n}(k)}{|{T}^{(n)}|}\ge {M}_{k}\phantom{\rule{1em}{0ex}}\mu \text{-a.e. on }D(c).

(15)

Since \frac{1}{|{T}^{(n+1)}|}\ge \frac{1}{M-1}\cdot \frac{1}{|{T}^{(n)}|}, the lemma follows from (15). □
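To illustrate the growth comparison used in this last step (a worked homogeneous special case, not part of the original argument): on the Cayley tree {T}_{C,N} one has |{L}_{m}|={N}^{m} and M=N+1, so

|{T}^{(n)}|=\sum _{m=0}^{n}{N}^{m}=\frac{{N}^{n+1}-1}{N-1},\phantom{\rule{2em}{0ex}}\frac{|{T}^{(n+1)}|}{|{T}^{(n)}|}=\frac{{N}^{n+2}-1}{{N}^{n+1}-1}\to N=M-1\phantom{\rule{1em}{0ex}}(n\to \mathrm{\infty}),

so passing from the normalization |{T}^{(n)}| in (15) to |{T}^{(n+1)}| in (14) costs asymptotically at most a factor M-1.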

In the following, we always let N\ge 0, k\in S, {d}^{0}(t)=1, and write {d}^{N}(t)=|\{\tau \in T:{N}_{\tau}=t\}|, where {N}_{\tau} denotes the *N*th predecessor of *τ*, as defined above. Let

{S}_{k}^{N}({T}^{(n)})=\sum _{t\in {T}^{(n)}}{\delta}_{k}({X}_{t}){d}^{N}(t).
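The quantity {d}^{N}(t) is simply the number of descendants of *t* exactly *N* levels below it, which a short sketch makes concrete (the tree and helper names are illustrative, not from the source):

```python
# d^N(t) counts the vertices tau whose N-th predecessor is t, i.e. the
# descendants of t exactly N levels below it.

parent = {'o': None, 'a': 'o', 'b': 'o',          # full binary tree
          'aa': 'a', 'ab': 'a', 'ba': 'b', 'bb': 'b'}

def children(t):
    return [u for u in parent if parent[u] == t]

def d_N(t, N):
    """d^0(t) = 1; d^N(t) is the sum of d^{N-1} over the children of t."""
    if N == 0:
        return 1
    return sum(d_N(u, N - 1) for u in children(t))
```

Here every vertex of {T}^{(1)} has two children, so `d_N('a', 1)` returns 2 and `d_N('o', 2)` returns 4, consistent with the bounds {(m-1)}^{N}\le {d}^{N}(t)\le {(M-1)}^{N} used below.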

**Corollary 1** *Let* *m*, *M* *be defined as above*. *Under the assumptions of Lemma* 2, *we have*

\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}inf}\frac{{S}_{k}^{N+1}({T}^{(n-1)})}{|{T}^{(n)}|}\ge \frac{{M}_{k}{(m-1)}^{N+1}}{M-1}\phantom{\rule{1em}{0ex}}\mu \mathit{\text{-a.e. on }}D(c).

(18)

*Proof* Since *T* is a uniformly bounded tree which has no leaves, we have {(m-1)}^{N}\le {d}^{N}(t)\le {(M-1)}^{N}. By Lemma 2 we have

\begin{array}{rcl}\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}inf}\frac{{S}_{k}^{N+1}({T}^{(n-1)})}{|{T}^{(n)}|}& =& \underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}inf}\frac{{\sum}_{t\in {T}^{(n-1)}}{\delta}_{k}({X}_{t}){d}^{N+1}(t)}{|{T}^{(n)}|}\\ & \ge & {(m-1)}^{N+1}\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}inf}\frac{{\sum}_{t\in {T}^{(n-1)}}{\delta}_{k}({X}_{t})}{|{T}^{(n)}|}\\ & =& {(m-1)}^{N+1}\underset{n\to \mathrm{\infty}}{lim\hspace{0.17em}inf}\frac{{S}_{n-1}(k)}{|{T}^{(n)}|}\\ & \ge & \frac{{M}_{k}{(m-1)}^{N+1}}{M-1}\phantom{\rule{1em}{0ex}}\mu \text{-a.e. on }D(c).\end{array}

(19)

The proof is finished. □

**Lemma 3** *Let* *T* *be a uniformly bounded tree which has no leaves*, P=(P(y|x)), x,y\in S *be a positive stochastic matrix*, *μ*, {\mu}_{P} *be two probability measures on* (\mathrm{\Omega},\mathcal{F}), \{{X}_{t},t\in T\} *be Markov chains indexed by* *T* *under probability measure* {\mu}_{P} *determined by* *P*, \{{g}_{t}(x,y),t\in T\} *be functions defined on* {S}^{2}, {L}_{0}=\{o\} (*where* *o* *is the root of the tree* *T*), {\mathcal{F}}_{n}=\sigma ({X}^{{T}^{(n)}}), *λ* *be a real number*. *Let*

{t}_{n}(\lambda ,\omega )=\frac{{e}^{\lambda {\sum}_{t\in {T}^{(n)}\setminus \{o\}}{g}_{t}({X}_{{1}_{t}},{X}_{t})}}{{\prod}_{t\in {T}^{(n)}\setminus \{o\}}{E}_{{\mu}_{P}}[{e}^{\lambda {g}_{t}({X}_{{1}_{t}},{X}_{t})}|{X}_{{1}_{t}}]}\frac{{\mu}_{P}({X}^{{T}^{(n)}})}{\mu ({X}^{{T}^{(n)}})},

(20)

*where* {E}_{{\mu}_{P}} *is the expectation under probability measure* {\mu}_{P}. *Then* ({t}_{n}(\lambda ,\omega ),{\mathcal{F}}_{n},n\ge 1) *is a non*-*negative martingale under probability measure* *μ*.

*Proof* The proof of this lemma is similar to that of Lemma 3 of [7], so we omit it. □
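The normalization in (20) can be sanity-checked numerically: when \mu ={\mu}_{P}, the measure ratio in (20) equals 1 and each factor of the remaining product has conditional {\mu}_{P}-mean 1, so {E}_{\mu}[{t}_{n}(\lambda ,\omega )]=1. The sketch below verifies this by enumeration on a tiny tree; the tree, the function g, and all parameters are illustrative choices, not from the source.

```python
import itertools
import math

# Check that t_n(lambda, .) from (20) has expectation 1 under mu_P
# in the case mu = mu_P (so the measure ratio in (20) is 1).

parent = {'o': None, 'a': 'o', 'b': 'o'}
vertices = ['o', 'a', 'b']
S = [0, 1]
p = {0: 0.4, 1: 0.6}
P = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}
lam = 0.5

def g(x, y):
    """An arbitrary function on S^2, standing in for g_t."""
    return x + 2 * y

def cond_mean(x):
    """E_{mu_P}[e^{lam * g(X_{1_t}, X_t)} | X_{1_t} = x]."""
    return sum(P[(x, y)] * math.exp(lam * g(x, y)) for y in S)

expectation = 0.0
for values in itertools.product(S, repeat=len(vertices)):
    config = dict(zip(vertices, values))
    prob = p[config['o']]        # mu_P-probability of the configuration
    t_n = 1.0                    # value of t_n(lam, .) on it
    for t in vertices:
        if parent[t] is not None:
            x, y = config[parent[t]], config[t]
            prob *= P[(x, y)]
            t_n *= math.exp(lam * g(x, y)) / cond_mean(x)
    expectation += prob * t_n
```

The enumeration yields `expectation` equal to 1, as the martingale property requires at each fixed *n*.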