A tree is a graph which is connected and contains no circuits. Given any two vertices $\alpha$ and $\beta$, let $\overline{\alpha\beta}$ be the unique path connecting $\alpha$ and $\beta$. Define the graph distance $d(\alpha, \beta)$ to be the number of edges contained in the path $\overline{\alpha\beta}$.
Let $T$ be an infinite tree which is locally finite and has no leaves. We choose a vertex $o$ as the root. We call the number of neighbors of a vertex of the tree the degree of this vertex. When the degrees of all vertices of $T$ are uniformly bounded, we say $T$ is a uniformly bounded tree. A tree is said to be a Bethe tree, denoted by $T_{B,N}$, if every vertex has $N+1$ neighboring vertices. A tree is said to be a Cayley tree, denoted by $T_{C,N}$, if the root has only $N$ neighbors and every other vertex has $N+1$ neighbors. Both kinds of trees are common homogeneous trees, and obviously both are special cases of the uniformly bounded tree. When the context permits, uniformly bounded trees are all denoted simply by $T$.
Let $T$ be an infinite tree with root $o$. The set of all vertices with distance $n$ from the root is called the $n$th generation of $T$, which is denoted by $L_n$. We denote by $T^{(n)}$ the subtree comprised of level $0$ (the root $o$) through level $n$. For each vertex $t$, there is a unique path from $o$ to $t$, and we write $|t|$ for the number of edges on this path. We denote the first predecessor of $t$ by $1_t$, the second predecessor of $t$ by $2_t$, and denote by $n_t$ the $n$th predecessor of $t$. For any two vertices $s$ and $t$ of the tree $T$, write $s \le t$ if $s$ is on the unique path from the root $o$ to $t$. We denote by $s \wedge t$ the vertex farthest from $o$ satisfying $s \wedge t \le s$ and $s \wedge t \le t$.
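For instance, it follows directly from these definitions that in the Cayley tree $T_{C,N}$ the root has $N$ children and every other vertex has exactly $N$ children, so
$|L_n| = N^n, \qquad |T^{(n)}| = \sum_{k=0}^{n} N^k = \frac{N^{n+1}-1}{N-1},$
while in the Bethe tree $T_{B,N}$ one has $|L_1| = N+1$ and $|L_n| = (N+1)N^{n-1}$ for $n \ge 1$. Note also that if $s \le t$, then $s \wedge t = s$; in particular, $o \wedge t = o$ for every vertex $t$.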
Let $(\Omega, \mathcal{F})$ be a measurable space and $\{X_t, t \in T\}$ be a collection of random variables defined on $(\Omega, \mathcal{F})$ and taking values in $S = \{0, 1, \ldots, b-1\}$, where $b$ is a positive integer. Let $A$ be a subgraph of $T$, write $X^A = \{X_t, t \in A\}$, and denote by $|A|$ the number of vertices of $A$. Let $x^A$ be a realization of $X^A$. Let $\mu$ be a probability measure on the measurable space $(\Omega, \mathcal{F})$. We will call $\mu$ the random field on the tree $T$. Let the distribution of $X^{T^{(n)}}$ under the probability measure $\mu$ be $\mu(x^{T^{(n)}}) = \mu(X^{T^{(n)}} = x^{T^{(n)}})$; $\mu(x^{T^{(n)}})$ is actually the marginal distribution of $\mu$ on $T^{(n)}$.
Definition 1 (see [1])
Let $T$ be an infinite tree which is locally finite and has no leaves, $S$ be a finite state space, $\{X_t, t \in T\}$ be a collection of $S$-valued random variables defined on the measurable space $(\Omega, \mathcal{F})$, and let
$q = \{q(x), x \in S\}$ (1)
be a distribution on S,
$P = (P(y \mid x)), \quad x, y \in S$ (2)
be a stochastic matrix on $S^2$. Let $\mu_P$ be a probability measure on the measurable space $(\Omega, \mathcal{F})$. If for any vertex $t$,
$\mu_P(X_t = y \mid X_{1_t} = x \text{ and } X_\sigma \text{ for } \sigma \wedge t \le 1_t) = \mu_P(X_t = y \mid X_{1_t} = x) = P(y \mid x) \quad \text{for all } x, y \in S,$ (3)
and
$\mu_P(X_o = x) = q(x) \quad \text{for all } x \in S,$
then $\{X_t, t \in T\}$ will be called $S$-valued Markov chains indexed by an infinite tree $T$ with initial distribution (1) and transition matrix (2) under the probability measure $\mu_P$.
Let $\{X_t, t \in T\}$ be Markov chains indexed by the tree $T$ under the probability measure $\mu_P$ defined above. It is easy to see that
$\mu_P(x^{T^{(n)}}) = q(x_o) \prod_{t \in T^{(n)} \setminus \{o\}} P(x_t \mid x_{1_t}).$ (4)
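To make the product formula (4) concrete, here is a minimal numerical sketch (the helper names sample_tree_chain and joint_probability are purely illustrative and are not taken from the paper): it samples a realization $x^{T^{(n)}}$ on the first $n$ generations of a Cayley tree $T_{C,N}$ and evaluates its probability under $\mu_P$ as the product of $q(x_o)$ and the one-step transition probabilities along the edges.

import random

def sample_tree_chain(N, n, q, P):
    """Sample states for T^(n) of a Cayley tree: the root has N children
    and every other vertex also has N children (hence N+1 neighbours)."""
    states = list(range(len(q)))
    x = {(): random.choices(states, weights=q)[0]}   # the root is the empty tuple
    level = [()]
    for _ in range(n):
        nxt = []
        for t in level:
            for c in range(N):
                child = t + (c,)
                # the state of a child is drawn from the row of P indexed by its parent's state
                x[child] = random.choices(states, weights=P[x[t]])[0]
                nxt.append(child)
        level = nxt
    return x

def joint_probability(x, q, P):
    """Evaluate mu_P(x^{T^(n)}) = q(x_o) * prod over non-root t of P(x_t | x_{1_t})."""
    prob = q[x[()]]
    for t, state in x.items():
        if t:                       # skip the root
            parent = t[:-1]         # the first predecessor 1_t
            prob *= P[x[parent]][state]
    return prob

q = [0.5, 0.5]
P = [[0.7, 0.3], [0.4, 0.6]]
x = sample_tree_chain(N=2, n=3, q=q, P=P)
print(joint_probability(x, q, P))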
In order to avoid technical problems, we always assume that $q(x)$ and $P(y \mid x)$ are positive.
Definition 2 Let $T$ be a uniformly bounded tree which has no leaves, $\{X_t, t \in T\}$ be a collection of $S$-valued random variables defined on $(\Omega, \mathcal{F})$, $P$ be a positive stochastic matrix, $\mu$, $\mu_P$ be two probability measures on $(\Omega, \mathcal{F})$, and $\{X_t, t \in T\}$ be Markov chains indexed by the tree $T$ under the probability measure $\mu_P$ determined by $P$. Assume that $\mu(x^{T^{(n)}})$ is always strictly positive. Let
$r_n(\omega) = \frac{1}{|T^{(n)}|} \ln \frac{\mu(X^{T^{(n)}})}{\mu_P(X^{T^{(n)}})},$ (5)
$r(\omega) = \limsup_{n \to \infty} r_n(\omega).$ (6)
$r(\omega)$ will be called the asymptotic logarithmic likelihood ratio.
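Combining (4) with (5), $r_n(\omega)$ admits the explicit form
$r_n(\omega) = \frac{1}{|T^{(n)}|} \Big[ \ln \mu(X^{T^{(n)}}) - \ln q(X_o) - \sum_{t \in T^{(n)} \setminus \{o\}} \ln P(X_t \mid X_{1_t}) \Big],$
which compares the sample probability of $X^{T^{(n)}}$ under the random field $\mu$ with the probability it would have under the tree-indexed Markov chain $\mu_P$.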
Remark 1 If $\mu = \mu_P$, then $r(\omega) = 0$. In Lemma 1 we will show that in the general case $r(\omega) \ge 0$ $\mu$-a.e.; hence $r(\omega)$ can be regarded as a measure of the Markov approximation of an arbitrary random field on $T$.
The tree model has recently drawn increasing interest from specialists in physics, probability and information theory. Benjamini and Peres [1] introduced the notion of tree-indexed homogeneous Markov chains and studied their recurrence and ray-recurrence. Berger and Ye [2] studied the existence of the entropy rate for some stationary random fields on a homogeneous tree. Ye and Berger [3] studied the asymptotic equipartition property (AEP), in the sense of convergence in probability, for a PPG-invariant and ergodic random field on a homogeneous tree. Recently, Yang [4] studied some strong limit theorems for countable homogeneous Markov chains indexed by a homogeneous tree, as well as the strong law of large numbers and the AEP for finite homogeneous Markov chains indexed by a homogeneous tree. Huang and Yang [5] studied the strong law of large numbers for Markov chains indexed by an infinite tree with uniformly bounded degree. Liu and Wang [6] studied the small deviation theorems between arbitrary random fields and Markov chain fields on the Cayley tree. Peng, Yang and Wang [7] further studied a class of small deviation theorems for functionals of random fields on a homogeneous tree, which partially extend the results of [6].
In this paper, by introducing the asymptotic logarithmic likelihood ratio as a measure of the Markov approximation of an arbitrary random field on a uniformly bounded tree, and by constructing a non-negative martingale, we obtain two results: a class of small deviation theorems for functionals of random fields on a uniformly bounded tree, and a class of small deviation theorems for the frequencies of occurrence of states. In fact, our present results imply the corresponding results of [5] and [7].
Lemma 1 Let $T$ be a uniformly bounded tree which has no leaves. Let $\mu_1$, $\mu_2$ be two probability measures on $(\Omega, \mathcal{F})$, and let $\{\tau_n, n \ge 1\}$ be a sequence of positive random variables such that
$\liminf_{n \to \infty} \frac{\tau_n}{|T^{(n)}|} > 0 \quad \mu_1\text{-a.e.}$ (7)
Then
$\limsup_{n \to \infty} \frac{1}{\tau_n} \ln \frac{\mu_2(X^{T^{(n)}})}{\mu_1(X^{T^{(n)}})} \le 0 \quad \mu_1\text{-a.e.}$ (8)
Proof The proof of this lemma is similar to that of Lemma 1 of [6], so we omit it. □
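For orientation, a standard way to obtain a statement of this type (this is only a heuristic sketch, not the argument of [6]) is to note that the likelihood ratio is a non-negative martingale under $\mu_1$:
$E_{\mu_1}\Big[ \frac{\mu_2(X^{T^{(n+1)}})}{\mu_1(X^{T^{(n+1)}})} \,\Big|\, X^{T^{(n)}} \Big] = \sum_{x^{L_{n+1}}} \frac{\mu_2(X^{T^{(n)}}, x^{L_{n+1}})}{\mu_1(X^{T^{(n)}}, x^{L_{n+1}})} \, \mu_1(x^{L_{n+1}} \mid X^{T^{(n)}}) = \frac{\mu_2(X^{T^{(n)}})}{\mu_1(X^{T^{(n)}})}.$
By the martingale convergence theorem the ratio converges $\mu_1$-a.e. to a finite limit, and since (7) forces $\tau_n \to \infty$, (8) follows.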
Remark 2 Let $\mu_1 = \mu$, $\mu_2 = \mu_P$, and $\tau_n = |T^{(n)}|$ in Lemma 1; then by (8) there exists $A \in \mathcal{F}$ with $\mu(A) = 1$ such that
$\limsup_{n \to \infty} \frac{1}{|T^{(n)}|} \ln \frac{\mu_P(X^{T^{(n)}})}{\mu(X^{T^{(n)}})} \le 0, \quad \omega \in A,$ (9)
hence we have $r(\omega) \ge 0$, $\mu$-a.e.
From this remark, we know that and the sequence of are existent.
Let $T$ be a uniformly bounded tree and $k \in S$. Let
$S_n(k) = |\{ t \in T^{(n)} : X_t = k \}|,$
that is,
$S_n(k) = \sum_{t \in T^{(n)}} \delta_k(X_t),$
where $\delta_k(x)$ ($k \in S$) is the Kronecker $\delta$-function
$\delta_k(x) = \begin{cases} 1, & x = k, \\ 0, & x \ne k. \end{cases}$
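In particular, summing over the states gives $\sum_{k \in S} S_n(k) = |T^{(n)}|$, so $S_n(k)/|T^{(n)}|$ is the empirical frequency of occurrence of the state $k$ among the vertices of $T^{(n)}$.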
Let the degree of each vertex $\sigma$ ($\sigma \in T$) on the tree $T$ be $d(\sigma)$. Since $T$ is a uniformly bounded tree which has no leaves, we know that there are two positive numbers $m$ and $M$ such that $m \le d(\sigma) \le M$.
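As a direct consequence of these bounds, every vertex other than the root has exactly $d(\sigma) - 1$ children, so the generation sizes satisfy
$(m-1)\,|L_n| \le |L_{n+1}| \le (M-1)\,|L_n|, \quad n \ge 1,$
and hence $|T^{(n)}|$ grows at most geometrically in $n$ with ratio $M-1$.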
Lemma 2 Let $T$ be a uniformly bounded tree which has no leaves, $P$ be a positive stochastic matrix, $\mu$, $\mu_P$ be two probability measures on $(\Omega, \mathcal{F})$, $\{X_t, t \in T\}$ be Markov chains indexed by $T$ under the probability measure $\mu_P$ determined by $P$, $r(\omega)$ be denoted by (6), $M$ be defined as above, $c \ge 0$ be a constant, and
where , . Then
(14)
Proof By using a proof similar to that of the corresponding lemma of [6], we can obtain
(15)
Since , this lemma follows from (15). □
In the following, we always let , , , and denote by , where is defined as above. Let
Corollary 1 Let $m$, $M$ be defined as above. Under the assumptions of Lemma 2, we have
(18)
Proof Because T is a uniformly bounded tree and has no leaves, then . By Lemma 2 we have
(19)
The proof is finished. □
Lemma 3 Let $T$ be a uniformly bounded tree which has no leaves, $P$ be a positive stochastic matrix, $\mu$, $\mu_P$ be two probability measures on $(\Omega, \mathcal{F})$, $\{X_t, t \in T\}$ be Markov chains indexed by $T$ under the probability measure $\mu_P$ determined by $P$, $\{g_t(x, y), t \in T \setminus \{o\}\}$ be functions defined on $S^2$ (where $o$ is the root of the tree $T$), $\mathcal{F}_n = \sigma(X^{T^{(n)}})$, and $\lambda$ be a real number. Let
(20)
where $E_{\mu_P}$ is the expectation under the probability measure $\mu_P$. Then $\{t_n(\lambda, \omega), \mathcal{F}_n, n \ge 1\}$ is a non-negative martingale under the probability measure $\mu$.
Proof The proof of this lemma is similar to that of Lemma 3 of [7], so we omit it. □
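A construction of this kind, standard in this line of work (cf. [5, 7]), typically takes the following form; we record it here only as a plausible reading of the display (20), not as a verbatim quotation:
$t_n(\lambda, \omega) = \frac{ e^{\lambda \sum_{t \in T^{(n)} \setminus \{o\}} g_t(X_{1_t}, X_t)} }{ \prod_{t \in T^{(n)} \setminus \{o\}} E_{\mu_P}\big[ e^{\lambda g_t(X_{1_t}, X_t)} \mid X_{1_t} \big] } \cdot \frac{\mu_P(X^{T^{(n)}})}{\mu(X^{T^{(n)}})}.$
With this choice, the martingale property under $\mu$ follows by conditioning on $\mathcal{F}_n$: conditioning under $\mu$ cancels the factor $1/\mu(X^{T^{(n+1)}})$, and then the product formula (4), together with the normalizing conditional expectations $E_{\mu_P}[e^{\lambda g_t(X_{1_t}, X_t)} \mid X_{1_t}]$, reduces the conditional expectation of $t_{n+1}(\lambda, \omega)$ to $t_n(\lambda, \omega)$.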