Entropy and Relative Entropy from Information-Theoretic Principles
- Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
- Publication Type: Journal Article
- Citation: IEEE Transactions on Information Theory, 2021, 67(10), pp. 6313-6327
- Issue Date: 2021-10-01
Closed Access
| Filename | Description | Size |
|---|---|---|
| Entropy_and_Relative_Entropy_From_Information-Theoretic_Principles.pdf | Published version | 729.38 kB |
This item is closed access and not available.
We introduce an axiomatic approach to entropies and relative entropies that relies only on minimal information-theoretic axioms, namely monotonicity under mixing and data-processing as well as additivity for product distributions. We find that these axioms induce sufficient structure to establish continuity in the interior of the probability simplex and meaningful upper and lower bounds, e.g., we find that every relative entropy satisfying these axioms must lie between the Rényi divergences of order 0 and ∞. We further show simple conditions for positive definiteness of such relative entropies and a characterisation in terms of a variant of relative trumping. Our main result is a one-to-one correspondence between entropies and relative entropies.
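For context, the Rényi divergences of order 0 and ∞ referenced in the abstract have standard closed forms; the sketch below states them for discrete distributions P and Q, together with the sandwich bound the abstract describes. The notation (D_0, D_∞, and the generic relative entropy 𝔻) is ours for illustration and is not taken from the paper itself.

```latex
% Standard Rényi divergences of order 0 and infinity for discrete P, Q
% (illustrative notation; not the paper's own).
D_0(P \| Q) = -\log \sum_{x \,:\, P(x) > 0} Q(x),
\qquad
D_\infty(P \| Q) = \log \max_{x \,:\, P(x) > 0} \frac{P(x)}{Q(x)}.

% The bound stated in the abstract: any relative entropy \mathbb{D}
% satisfying the axioms lies between these two extremes,
D_0(P \| Q) \;\le\; \mathbb{D}(P \| Q) \;\le\; D_\infty(P \| Q).
```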