DOI: 10.7763/IJCCE.2013.V2.172
Information Entropy Estimation Using Indirect Conditional Information
Abstract—Conditional entropy measures the information flow from one discrete random variable to another. It is conventionally calculated from the conditional or joint probabilities of the two variables; when the joint probability is not known, the conditional entropy cannot be computed directly. This paper presents an approach to estimating the conditional entropy using indirect conditional information. In normal cases, the estimate overestimates the actual value by less than ten percent.
Index Terms—Entropy, conditional entropy, information flow, indirect conditional information.
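The abstract refers to the conventional calculation of conditional entropy from the joint probabilities of two variables. As a point of reference only, the minimal Python sketch below shows that conventional computation, H(Y|X) = -Σ p(x,y) log p(y|x), for a hypothetical 2x2 joint table; it does not reproduce the paper's indirect estimation method.

```python
import numpy as np

def conditional_entropy(joint, base=2.0):
    """H(Y|X) computed from a joint probability table joint[x][y]."""
    joint = np.asarray(joint, dtype=float)
    p_x = joint.sum(axis=1, keepdims=True)                  # marginal p(x)
    with np.errstate(divide="ignore", invalid="ignore"):
        cond = np.where(joint > 0, joint / p_x, 0.0)        # p(y|x)
        log_cond = np.where(cond > 0, np.log(cond) / np.log(base), 0.0)
    return float(-(joint * log_cond).sum())

# Hypothetical joint distribution of X and Y (illustration only).
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(conditional_entropy(joint))   # ≈ 0.722 bits
```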
Nol Premasathian is with the Faculty of Information Technology, King Mongkut’s Institute of Technology Ladkrabang, Bangkok, Thailand (e-mail: nol@it.kmitl.ac.th).
Watcharee Tantikittipisut is in Bangkok, Thailand (e-mail: charlieante@gmail.com).
Cite: Nol Premasathian and Watcharee Tantikittipisut, "Information Entropy Estimation Using Indirect Conditional Information," International Journal of Computer and Communication Engineering, vol. 2, no. 2, pp. 210-212, 2013.