Entropy and information rates for hidden Markov models

Hanseok Ko, R. H. Baran

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

A practical approach to statistical inference for hidden Markov models (HMMs) requires expressions for the mean and variance of the log-probability of an observed T-long sequence given the model parameters. From the viewpoint of Shannon theory, in the limit of large T, the expected per-step log-probability is the negative of the entropy rate at the output of a noisy channel driven by the Markov source. A novel procedure for finding the entropy rate is presented. The rate-distortion function of the Markov source, subject to the requirement of instantaneous coding, is a by-product of the derivation.
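The record carries no derivation, but the limit the abstract invokes is easy to check numerically. The Python sketch below is not the paper's novel procedure; it is a standard Monte Carlo estimate using the scaled forward algorithm, with a hypothetical two-state model whose matrices A, B, and pi are made-up placeholders. For a stationary ergodic HMM, -(1/T) log P(o_1, ..., o_T) converges to the entropy rate of the output process as T grows (Shannon-McMillan-Breiman), which is exactly the relation stated in the abstract.

```python
import numpy as np

# Illustrative sketch (not the paper's procedure): Monte Carlo estimate of the
# HMM output entropy rate via -(1/T) log P(o_1..T). All parameters are made up.

rng = np.random.default_rng(0)

A = np.array([[0.9, 0.1],    # hypothetical Markov transition matrix
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],    # hypothetical emission probabilities
              [0.4, 0.6]])   # (the "noisy channel" driven by the source)
pi = np.array([0.5, 0.5])    # initial state distribution

def sample_observations(T):
    """Draw one T-long observation sequence from the HMM."""
    s = rng.choice(2, p=pi)
    obs = np.empty(T, dtype=int)
    for t in range(T):
        obs[t] = rng.choice(2, p=B[s])  # emit through the channel
        s = rng.choice(2, p=A[s])       # advance the Markov source
    return obs

def log_prob(obs):
    """Scaled forward algorithm: returns log P(obs | model) in nats."""
    alpha = pi * B[:, obs[0]]
    logp = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        logp += np.log(alpha.sum())  # accumulate log scaling factors
        alpha /= alpha.sum()
    return logp

T = 100_000
h = -log_prob(sample_observations(T)) / T
print(f"estimated entropy rate: {h:.4f} nats/step")
```

For large T a single sequence suffices, since the per-step log-probability concentrates around its mean; averaging over several independent sequences would also give an empirical handle on the variance term the abstract mentions.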

Original language: English
Title of host publication: Proceedings - 1998 IEEE International Symposium on Information Theory, ISIT 1998
Number of pages: 1
DOIs
Publication status: Published - 1998
Event: 1998 IEEE International Symposium on Information Theory, ISIT 1998 - Cambridge, MA, United States
Duration: 1998 Aug 16 - 1998 Aug 21

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8095

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modelling and Simulation
  • Applied Mathematics
