TY - GEN
T1 - Entropy and information rates for hidden Markov models
AU - Ko, Hanseok
AU - Baran, R. H.
PY - 1998
AB - A practical approach to statistical inference for hidden Markov models (HMMs) requires expressions for the mean and variance of the log-probability of an observed T-long sequence given the model parameters. From the viewpoint of Shannon theory, in the limit of large T, the expected value of the per-step log-probability is minus one times the mean entropy rate at the output of a noisy channel driven by the Markov source. A novel procedure for finding the entropy rate is presented. The rate-distortion function of the Markov source, subject to the requirement of instantaneous coding, is a by-product of the derivation.
UR - http://www.scopus.com/inward/record.url?scp=84890370594&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84890370594&partnerID=8YFLogxK
DO - 10.1109/ISIT.1998.708979
M3 - Conference contribution
AN - SCOPUS:84890370594
SN - 0780350006
SN - 9780780350007
T3 - IEEE International Symposium on Information Theory - Proceedings
BT - Proceedings - 1998 IEEE International Symposium on Information Theory, ISIT 1998
T2 - 1998 IEEE International Symposium on Information Theory, ISIT 1998
Y2 - 16 August 1998 through 21 August 1998
ER -