Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis
The nature of the dependence between random variables has been the subject of statistical inquiry for over a century. Even today, there is a great deal of research on this topic, with a particular focus on the analysis of nonlinearity. Shannon mutual information has been considered to be the mos...
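The paper's topic is testing for nonlinear dependence with entropy-based mutual information measures. As a rough illustration only, the sketch below estimates Shannon, Rényi, and Tsallis mutual information from binned data using the additive decomposition I(X;Y) = H(X) + H(Y) - H(X,Y); the function names, the entropy orders, and the histogram estimator are illustrative assumptions and not the authors' exact procedure.

```python
# Minimal sketch (not the authors' estimator): histogram-based Shannon, Renyi,
# and Tsallis mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y).
# Parameter choices (alpha, q, bins) are illustrative assumptions.
import numpy as np

def _joint_probs(x, y, bins=10):
    # Joint probability mass from a 2-D histogram of the samples.
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    return counts / counts.sum()

def renyi_entropy(p, alpha):
    p = p[p > 0]
    if np.isclose(alpha, 1.0):            # alpha -> 1 recovers Shannon entropy
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    p = p[p > 0]
    if np.isclose(q, 1.0):                # q -> 1 recovers Shannon entropy
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def mutual_information(x, y, kind="shannon", order=1.0, bins=10):
    pxy = _joint_probs(x, y, bins)
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    if kind == "renyi":
        H = renyi_entropy
    elif kind == "tsallis":
        H = tsallis_entropy
    else:
        H = lambda p, _order: -np.sum(p[p > 0] * np.log(p[p > 0]))
    # Entropy-difference form of mutual information; positive values
    # indicate dependence (linear or nonlinear) between X and Y.
    return H(px, order) + H(py, order) - H(pxy.ravel(), order)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y = x ** 2 + 0.1 * rng.normal(size=5000)   # nonlinear, non-monotone dependence
    for kind, order in [("shannon", 1.0), ("renyi", 2.0), ("tsallis", 2.0)]:
        print(f"{kind} MI (order {order}): {mutual_information(x, y, kind, order):.4f}")
```

A dependence structure like y = x^2 has near-zero Pearson correlation, yet all three mutual information estimates come out clearly positive, which is the kind of nonlinearity these measures are meant to detect.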
| Main Authors: | Elif Tuna, Atıf Evren, Erhan Ustaoğlu, Büşra Şahin, Zehra Zeynep Şahinbaşoğlu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2022-12-01 |
| Series: | Entropy |
| Subjects: | |
| Online Access: | https://www.mdpi.com/1099-4300/25/1/79 |
Similar Items
- Conditional Rényi Entropy and the Relationships between Rényi Capacities
  by: Gautam Aishwarya, et al.
  Published: (2020-05-01)
- Tsallis Mutual Information for Document Classification
  by: Màrius Vila, et al.
  Published: (2011-09-01)
- A Two-Moment Inequality with Applications to Rényi Entropy and Mutual Information
  by: Galen Reeves
  Published: (2020-11-01)
- On a General Definition of Conditional Rényi Entropies
  by: Velimir M. Ilić, et al.
  Published: (2017-11-01)
- A Direct Link between Rényi–Tsallis Entropy and Hölder’s Inequality—Yet Another Proof of Rényi–Tsallis Entropy Maximization
  by: Hisa-Aki Tanaka, et al.
  Published: (2019-05-01)