Use of Relative Entropy Statistics in Contingency Tables
Chapter from the book: Akoğul, S., & Tuna, E. (Eds.). (2024). Academic Studies with Current Econometric and Statistical Applications.

Atıf Evren
Yildiz Technical University
Erhan Ustaoğlu
Marmara University
Elif Tuna
Yildiz Technical University
Büşra Şahin
Haliç University

Synopsis

There are various information-theoretic divergence measures used for determining associations between nominal variables. Among them, the Shannon mutual information statistic is especially appealing, since its sampling properties are well known. Although Shannon mutual information is used more frequently, Rényi and Tsallis mutual information, as parametric families that encompass various tools, provide much greater flexibility. Indeed, Shannon mutual information is a special case of the Kullback-Leibler divergence and a limiting case of both Rényi and Tsallis mutual information. In this study, the large-sample properties of the Shannon, Rényi, and Tsallis mutual information statistics are considered, alongside the Pearson, Tschuprow, Sakoda, Cramér, Hellinger, and Bhattacharyya measures. In simulations, approximate normality of most of the statistics and high positive correlations among all of these tools are observed, and their sampling variabilities are compared. Then, using the Rényi and Tsallis mutual information statistics, correlation coefficients are estimated for eight different scenarios and three bivariate normal distributions.
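As a rough illustration of the three mutual information statistics named above, the Python sketch below estimates them from a contingency table of counts. It uses the standard Rényi and Tsallis divergences of the joint distribution from the product of the marginals; the order parameters alpha and q, the function name, and the example table are illustrative assumptions, not values taken from the chapter.

import numpy as np

def mutual_informations(table, alpha=0.5, q=2.0):
    """Shannon, Renyi, and Tsallis mutual information estimates
    from an r x c contingency table of observed counts."""
    p = np.asarray(table, dtype=float)
    p /= p.sum()                        # joint cell proportions p_ij
    px = p.sum(axis=1, keepdims=True)   # row marginals p_i.
    py = p.sum(axis=0, keepdims=True)   # column marginals p_.j
    prod = px * py                      # independence benchmark p_i. * p_.j
    nz = p > 0                          # empty cells contribute nothing

    # Shannon MI: sum p_ij * log(p_ij / (p_i. * p_.j))
    shannon = np.sum(p[nz] * np.log(p[nz] / prod[nz]))
    # Renyi MI of order alpha: log(sum p^alpha * prod^(1-alpha)) / (alpha - 1)
    renyi = np.log(np.sum(p[nz] ** alpha * prod[nz] ** (1 - alpha))) / (alpha - 1)
    # Tsallis MI of order q: (sum p^q * prod^(1-q) - 1) / (q - 1)
    tsallis = (np.sum(p[nz] ** q * prod[nz] ** (1 - q)) - 1) / (q - 1)
    return shannon, renyi, tsallis

# Hypothetical 2 x 2 table with mild association
counts = [[30, 10], [15, 45]]
print(mutual_informations(counts))

All three statistics are zero under exact independence and positive otherwise; as alpha and q tend to 1, the Rényi and Tsallis values converge to the Shannon value, which is the limiting-case relationship described in the synopsis.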

How to cite this chapter

Evren, A., Ustaoğlu, E., Tuna, E., & Şahin, B. (2024). Use of Relative Entropy Statistics in Contingency Tables. In: Akoğul, S., & Tuna, E. (Eds.), Academic Studies with Current Econometric and Statistical Applications. Özgür Publications. DOI: https://doi.org/10.58830/ozgur.pub518.c2132

Published

November 26, 2024

DOI

https://doi.org/10.58830/ozgur.pub518.c2132