Use of Relative Entropy Statistics in Contingency Tables
Chapter in: Akoğul, S. & Tuna, E. (eds.) 2024. Güncel Ekonometrik ve İstatistiksel Uygulamalar ile Akademik Çalışmalar.
Abstract
There are various information-theoretic divergence measures used for determining associations between nominal variables. Among them, the Shannon mutual information statistic is especially appealing, since its sampling properties are well known. Although Shannon mutual information is more frequently used, the Rényi and Tsallis mutual informations, each a one-parameter family encompassing several divergence measures, offer considerably more flexibility. Indeed, Shannon mutual information is the Kullback-Leibler divergence between the joint distribution and the product of its marginals, and it arises as a limiting case of both the Rényi and Tsallis mutual informations as the order parameter tends to 1. In this study, the large-sample properties of the Shannon, Rényi, and Tsallis mutual information statistics are considered, together with the Pearson, Tschuprow, Sakoda, Cramér, Hellinger, and Bhattacharyya measures. In simulations, most of the statistics are observed to be approximately normal, and strong positive correlations are found among all these tools. Their sampling variabilities are compared. Then, using the Rényi and Tsallis mutual information statistics, correlation coefficients are estimated for eight different scenarios and three bivariate normal distributions.
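To make the relations among these statistics concrete, the following is a minimal sketch (not the authors' code) of how the Shannon, Rényi, and Tsallis mutual information statistics and Cramér's V might be computed from a two-way contingency table. The function names, the shared order parameter `alpha`, and the example table are illustrative assumptions.

```python
import numpy as np

def mutual_informations(table, alpha=0.5):
    """Shannon, Renyi, and Tsallis mutual information from a contingency table.

    Renyi and Tsallis share the order parameter `alpha`; both reduce to
    Shannon mutual information as alpha -> 1.
    """
    n = table.sum()
    p = table / n                       # joint cell proportions p_ij
    pr = p.sum(axis=1, keepdims=True)   # row marginals p_i.
    pc = p.sum(axis=0, keepdims=True)   # column marginals p_.j
    q = pr * pc                         # product of marginals (independence)
    mask = p > 0                        # skip empty cells (their terms vanish)

    shannon = np.sum(p[mask] * np.log(p[mask] / q[mask]))
    s = np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))
    renyi = np.log(s) / (alpha - 1.0)
    tsallis = (s - 1.0) / (alpha - 1.0)
    return shannon, renyi, tsallis

def cramers_v(table):
    """Cramer's V derived from the Pearson chi-square statistic."""
    n = table.sum()
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    chi2 = np.sum((table - expected) ** 2 / expected)
    r, c = table.shape
    return np.sqrt(chi2 / (n * (min(r, c) - 1)))

# Hypothetical 3x3 table of counts, for illustration only.
tab = np.array([[30, 10, 5],
                [12, 40, 8],
                [6, 9, 35]], dtype=float)
print(mutual_informations(tab, alpha=0.5))
print(cramers_v(tab))
```

Note that at `alpha=0.5` the inner sum `s` is the Bhattacharyya coefficient between the joint distribution and the product of the marginals, which hints at why the abstract groups these statistics with the Hellinger and Bhattacharyya measures.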