The theory of informity: a novel probability framework

Authors

H. Huang

DOI:

https://doi.org/10.17721/1812-5409.2025/1.7

Keywords:

informity, information content, probability, probability distribution

Abstract

This paper proposes a novel probability framework called the theory of informity. We define a mathematical quantity, "informity", that quantitatively measures the degree of informativeness of a probability distribution (or a probability system). We also define two related quantities: cross-informity and joint informity. We propose an informity metric that can be used as an alternative to the entropy metric. The informities of twelve continuous distributions are given, and three examples demonstrate the practicability of the proposed informity metric.
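The formal definitions of informity appear in the full text. Purely as an illustration of the kind of density functional such a metric involves, the sketch below computes a repeat-rate-style quantity, I[f] = ∫ f(x)² dx (cf. Rousseau, 2018; Ellerman, 2013), alongside differential entropy for two continuous distributions. The functional form, the helper name `repeat_rate`, and the chosen parameters are assumptions made for this sketch, not the paper's definition of informity.

```python
# A minimal sketch, NOT the paper's definition: we assume an informity-like
# functional of repeat-rate form, I[f] = integral of f(x)^2 dx
# (cf. Rousseau, 2018; Ellerman, 2013), and compare it with differential
# entropy for two continuous distributions.
from scipy import integrate, stats

def repeat_rate(pdf, lo, hi):
    """Numerically integrate pdf(x)**2 over [lo, hi] (hypothetical helper)."""
    value, _abserr = integrate.quad(lambda x: pdf(x) ** 2, lo, hi)
    return value

# Uniform(0, w): repeat rate = 1/w, differential entropy = ln(w).
w = 2.0
u = stats.uniform(loc=0.0, scale=w)
print(repeat_rate(u.pdf, 0.0, w), float(u.entropy()))  # 0.5, ~0.6931

# Normal(0, s): repeat rate = 1/(2*s*sqrt(pi)), entropy = 0.5*ln(2*pi*e*s^2).
s = 1.0
n = stats.norm(loc=0.0, scale=s)
print(repeat_rate(n.pdf, -10 * s, 10 * s),  # ~0.2821 = 1/(2*sqrt(pi))
      float(n.entropy()))                   # ~1.4189
```

Under this assumed functional, a more concentrated density yields a larger value and a larger differential entropy yields a smaller one, which matches the direction of "informativeness" suggested by the abstract; the actual definitions of informity, cross-informity, and joint informity should be taken from the paper itself.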

Pages of the article in the issue: 53–59

Language of the article: English

References

Castrup, H. (2004). Selecting and applying error distributions in uncertainty analysis. Measurement Science Conference, Anaheim, CA. http://www.isgmax.com/Articles_Papers/Selecting%20and%20Applying%20Error%20Distributions.pdf

Ellerman, D. (2013). An Introduction to Logical Entropy and its Relation to Shannon Entropy. International Journal of Semantic Computing, 7(2), 121–145. https://doi.org/10.1142/S1793351X13400059

Ellerman, D. (2022). Introduction to logical entropy and its relationship to Shannon entropy. 4open, 5, 1–33. https://doi.org/10.1051/fopen/2021004

Huang, H. (2024). A minimum entropy criterion for distribution selection for measurement uncertainty analysis. Measurement Science and Technology, 35, 035014. https://doi.org/10.1088/1361-6501/ad1476

Lad, F., Sanfilippo, G. & Agrò, G. (2015). Extropy: Complementary Dual of Entropy. Statistical Science, 30(1), 40–58. https://doi.org/10.1214/14-STS430

Petty, G. W. (2018). On some shortcomings of Shannon entropy as a measure of information content in indirect measurements of continuous variables. Journal of Atmospheric and Oceanic Technology, 35(5), 1011–1021. https://doi.org/10.1175/JTECH-D-17-0056.1

Rousseau, R. (2018). The repeat rate: from Hirschman to Stirling. Scientometrics, 116, 645–653. https://doi.org/10.1007/s11192-018-2724-8

Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27(3), 379–423.

Schroeder, M. J. (2004). An alternative to entropy in the measurement of information. Entropy, 6, 388–412.

Sussmann, G. (1997). Uncertainty relation: from inequality to equality. Zeitschrift für Naturforschung A, 52, 1–2. https://doi.org/10.1515/zna-1997-1-214

Wadsworth, H. M., Jr. (1989). Summarization and interpretation of data. In H. M. Wadsworth, Jr. (Ed.), Handbook of Statistical Methods for Engineers and Scientists (pp. 2.1–2.21). McGraw-Hill.

Xie, G. (2011). Further developments of two point process models for fine-scale time series. Massey University, New Zealand.

Zhou, V. (2019a). A simple explanation of information gain and entropy. https://victorzhou.com/blog/information-gain

Zhou, V. (2019b). A simple explanation of Gini impurity. https://victorzhou.com/blog/gini-impurity


Published

2025-07-07

Issue

Vol. 80 No. 1 (2025)

Section

Algebra, Geometry and Probability Theory

How to Cite

Huang, H. (2025). The theory of informity: a novel probability framework. Bulletin of Taras Shevchenko National University of Kyiv. Physical and Mathematical Sciences, 80(1), 53–59. https://doi.org/10.17721/1812-5409.2025/1.7