Summary: | Information entropy is a measure of the average rate of information produced by a random data source, with formulations in both classical and quantum information theory. The classical Shannon, Rényi, and Tsallis entropies find applications in fields such as cryptology, finance, signal processing, water engineering, and image processing. One key feature of quantum information theory is quantum entanglement, which has applications in quantum teleportation, quantum cryptography, and super-dense coding. To broaden the applications of quantum entropy, a new, computationally simpler representation of the von Neumann entropy is introduced and generalised to the Rényi and Tsallis entropies. The Shannon entropy is then applied to virus analysis, using SARS-CoV as a reference to determine possible binding sites between SARS-CoV-2, the virus responsible for Coronavirus Disease 2019 (COVID-19), and Angiotensin-Converting Enzyme 2 (ACE2).
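As a minimal illustration of the Shannon entropy mentioned above, the quantity H = -Σ p·log₂(p) can be computed over the symbol frequencies of a nucleotide string; the example sequence below is hypothetical and not drawn from the viral genomes studied here.

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies in seq."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical short nucleotide fragment, for illustration only.
print(shannon_entropy("ATGGCGTACGTTAGC"))
```

A uniform four-letter alphabet gives the maximum of 2 bits per symbol; lower values indicate compositional bias, which is what makes the measure useful for scanning genomic regions.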