Mutual information between discrete variables with many categories using recursive adaptive partitioning

Junhee Seok, Yeong Seon Kang

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)


Mutual information, a general measure of the relatedness between two random variables, has been actively used in the analysis of biomedical data. The mutual information between two discrete variables is conventionally calculated from their joint probabilities, estimated from the frequency of observed samples in each combination of variable categories. However, this conventional approach is no longer efficient for discrete variables with many categories, which are easily found in large-scale biomedical data such as diagnosis codes, drug compounds, and genotypes. Here, we propose a method that provides stable estimates of the mutual information between discrete variables with many categories. Simulation studies showed that, compared with the conventional calculation of mutual information, the proposed method reduced estimation errors 45-fold and improved correlation coefficients with the true values 99-fold. The proposed method was also demonstrated through a case study of diagnostic data in electronic health records. This method is expected to be useful in the analysis of various biomedical data with discrete variables.
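The conventional approach the abstract refers to can be sketched as a plug-in (maximum-likelihood) estimator: estimate the joint and marginal probabilities by category frequencies and sum p(x,y)·log(p(x,y)/(p(x)p(y))). This is a minimal illustration of that baseline only; the paper's recursive adaptive partitioning method is not reproduced here.

```python
import math
from collections import Counter

def plugin_mutual_info(x, y):
    """Plug-in MI estimate (in nats) from paired categorical samples.

    Joint and marginal probabilities are estimated by observed
    frequencies; this is the conventional estimator, which becomes
    unstable when the variables have many categories relative to
    the sample size.
    """
    n = len(x)
    pxy = Counter(zip(x, y))   # joint frequencies
    px = Counter(x)            # marginal frequencies of x
    py = Counter(y)            # marginal frequencies of y
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        # p_ab * log( p_ab / (p_a * p_b) ), written with counts
        mi += p_ab * math.log(p_ab * n * n / (px[a] * py[b]))
    return mi

# Perfectly dependent variables recover the entropy of x:
print(plugin_mutual_info([0, 0, 1, 1], [0, 0, 1, 1]))  # ln 2 ≈ 0.693
# Independent variables (balanced here) give zero:
print(plugin_mutual_info([0, 1, 0, 1], [0, 0, 1, 1]))  # 0.0
```

With many categories and limited samples, most joint cells are empty or contain a single observation, which is exactly the regime where this frequency-based estimate breaks down and motivates the adaptive partitioning approach of the paper.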

Original language: English
Article number: 10981
Journal: Scientific Reports
Publication status: Published - 2015 Jun 5

Bibliographical note

Funding Information:
This work was supported by the National Research Foundation of Korea grant (No. NRF-2014R1A1A2A16050527) and the Korea University Grant. We thank Mr. Minhyeok Lee for his valuable comments on the proposed method.

ASJC Scopus subject areas

  • General

