
Turk J Math 35 (2011), 729 – 736. © TÜBİTAK.

A class of generalized Shannon-McMillan theorems for arbitrary discrete information source*

Kangkang Wang

Abstract

In this study, a class of strong limit theorems for the relative entropy densities of random sums of an arbitrary information source is discussed by constructing the joint distribution and nonnegative supermartingales. As corollaries, some Shannon-McMillan theorems for arbitrary information sources, mth-order Markov information sources and non-memory information sources are obtained, and some results for the discrete information source obtained by previous authors are extended.

Key Words: Shannon-McMillan theorem, consistent distribution, arbitrary information source, relative entropy density, mth-order Markov information source, non-memory information source.

2000 AMS Mathematics Subject Classification: 60F15.

* The author would like to thank the anonymous referee for his careful and valuable suggestions. This work is supported by the Natural Science Foundation of High University of Jiangsu Province (09KJD110002).

1. Introduction

Suppose $\{X_n, n \geq 0\}$ is an arbitrary information source defined on a probability space $(\Omega, \mathcal{F}, P)$ taking values in the alphabet set $S = \{s_1, s_2, \cdots\}$. Let the joint distribution of $\{X_n, n \geq 0\}$ be

$$P(X_0 = x_0, \cdots, X_n = x_n) = p(x_0, \cdots, x_n) > 0, \qquad x_i \in S,\ 0 \leq i \leq n. \tag{1}$$

Denote

$$f_n(\omega) = -\frac{1}{n+1} \log p(X_0, \cdots, X_n), \tag{2}$$

where $\log$ is the natural logarithm; $f_n(\omega)$ is called the relative entropy density of $\{X_i, 0 \leq i \leq n\}$. Denote the conditional probability as follows:

$$p(X_n = x_n \mid X_0 = x_0, \cdots, X_{n-1} = x_{n-1}) = p_n(x_n \mid x_0, \cdots, x_{n-1}). \tag{3}$$

Then

$$p(X_0, \cdots, X_n) = p(X_0) \prod_{k=1}^{n} p_k(X_k \mid X_0, \cdots, X_{k-1}), \tag{4}$$

$$f_n(\omega) = -\frac{1}{n+1} \Big[ \log p(X_0) + \sum_{k=1}^{n} \log p_k(X_k \mid X_0, \cdots, X_{k-1}) \Big]. \tag{5}$$

The decomposition (4)–(5) is illustrated numerically in the sketch below.

Definition 1  Suppose $\sigma_n(\omega)$ is an increasing nonnegative stochastic sequence, and $\sigma_n(\omega) \uparrow \infty$,
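The following is a minimal numerical sketch, not taken from the paper, of the relative entropy density $f_n(\omega)$ of (2) and its chain-rule form (5). It uses a toy first-order Markov source as a simplification of the general conditional probabilities $p_k(x_k \mid x_0, \cdots, x_{k-1})$; the initial law p0, the transition matrix P, and the sample path x are hypothetical.

```python
import numpy as np

# Minimal sketch (assumed example, not from the paper): compute
# f_n(omega) = -(1/(n+1)) * log p(X_0, ..., X_n) of (2) in two ways for a
# toy first-order Markov source, where p_k(x_k | x_0, ..., x_{k-1}) of (3)
# reduces to a transition matrix entry.

p0 = np.array([0.6, 0.4])          # hypothetical initial law p(X_0 = s_i)
P = np.array([[0.7, 0.3],
              [0.2, 0.8]])         # hypothetical transition matrix

x = [0, 1, 1, 0, 1]                # one hypothetical realization x_0, ..., x_n
n = len(x) - 1

# Form (2): the joint probability p(x_0, ..., x_n), then -(1/(n+1)) * log.
joint = p0[x[0]] * np.prod([P[x[k - 1], x[k]] for k in range(1, n + 1)])
f_joint = -np.log(joint) / (n + 1)

# Form (5): the chain-rule decomposition (4) taken inside the logarithm.
log_sum = np.log(p0[x[0]]) + sum(np.log(P[x[k - 1], x[k]]) for k in range(1, n + 1))
f_chain = -log_sum / (n + 1)

print(f_joint, f_chain)            # the two values agree
```

In this special case $p_k(X_k \mid X_0, \cdots, X_{k-1}) = P[X_{k-1}, X_k]$, so the joint-probability form (2) and the factored form (5) produce the same value of $f_n(\omega)$, as (4) requires.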