Use of Relative Entropy Statistics in Contingency Tables
A chapter in: Akoğul, S. & Tuna, E. (eds.) 2024. Güncel Ekonometrik ve İstatistiksel Uygulamalar ile Akademik Çalışmalar.

Atıf Evren
Yıldız Teknik Üniversitesi
Erhan Ustaoğlu
Marmara Üniversitesi
Elif Tuna
Yıldız Teknik Üniversitesi
Büşra Şahin
Haliç Üniversitesi

Abstract

There are various information-theoretic divergence measures for assessing association between nominal variables. Among them, the Shannon mutual information statistic is especially appealing, since its sampling properties are well known. Although Shannon mutual information is used more frequently, Rényi and Tsallis mutual information, as parametric families that encompass various tools, provide much greater flexibility. Indeed, Shannon mutual information is a special case of Kullback-Leibler divergence and the limiting case of both Rényi and Tsallis mutual information as the entropic index tends to one. In this study, the large-sample properties of the Shannon, Rényi, and Tsallis mutual information statistics are considered, as well as the Pearson, Tschuprow, Sakoda, Cramér, Hellinger, and Bhattacharyya measures. In simulations, approximate normality of most of the statistics and high positive correlations among all these tools are observed, and their sampling variabilities are compared. Then, using Rényi and Tsallis mutual information statistics, correlation coefficients are estimated for eight different scenarios and three bivariate normal distributions.
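As a worked illustration (not taken from the chapter itself), the following minimal Python sketch estimates the three mutual information statistics from a table of observed counts, computing the Rényi and Tsallis versions as the corresponding divergences between the joint distribution and the product of its marginals. The function name, the entropic index alpha = 2, and the example counts are all hypothetical; the chapter's exact estimators and any bias corrections may differ.

import numpy as np

def mutual_informations(table, alpha=2.0):
    # Shannon, Renyi, and Tsallis mutual information of a contingency table.
    # table: 2-D array of observed counts; alpha: entropic index (alpha != 1).
    p = np.asarray(table, dtype=float)
    p /= p.sum()                                 # joint relative frequencies p_ij
    q = np.outer(p.sum(axis=1), p.sum(axis=0))   # product of the two marginals

    mask = p > 0                                 # cells with p_ij = 0 contribute nothing
    shannon = np.sum(p[mask] * np.log(p[mask] / q[mask]))

    s = np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))
    renyi = np.log(s) / (alpha - 1.0)            # Renyi divergence of order alpha
    tsallis = (s - 1.0) / (alpha - 1.0)          # Tsallis divergence of order alpha
    return shannon, renyi, tsallis

# Hypothetical 2x3 table of counts
counts = [[30, 10, 20],
          [15, 25, 20]]
print(mutual_informations(counts, alpha=2.0))

For a table whose rows and columns are exactly independent, all three statistics are zero, and as alpha tends to 1 both the Rényi and Tsallis values approach the Shannon value, consistent with the limiting relationship noted in the abstract.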

How to Cite

Evren, A., Ustaoğlu, E., Tuna, E., & Şahin, B. (2024). Use of Relative Entropy Statistics in Contingency Tables. In Akoğul, S. & Tuna, E. (Eds.), Güncel Ekonometrik ve İstatistiksel Uygulamalar ile Akademik Çalışmalar. Özgür Yayınları. DOI: https://doi.org/10.58830/ozgur.pub518.c2132

Publication Date

26 November 2024
