Entropy and relative entropy from information-theoretic principles
- Publication Type: Journal Article
- Citation: IEEE Trans. Inf. Theory, 2020, 67, pp. 6313-6327
- Issue Date: 2020-06-20
Closed Access
Filename | Description | Size
---|---|---
2006.11164v2.pdf | | 912.19 kB
This item is closed access and not available.
We introduce an axiomatic approach to entropies and relative entropies that relies only on minimal information-theoretic axioms, namely monotonicity under mixing and data-processing as well as additivity for product distributions. We find that these axioms induce sufficient structure to establish continuity in the interior of the probability simplex and meaningful upper and lower bounds; for example, every relative entropy must lie between the Rényi divergences of order $0$ and $\infty$. We further show simple conditions for positive definiteness of such relative entropies and a characterisation in terms of a variant of relative trumping. Our main result is a one-to-one correspondence between entropies and relative entropies.
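The sandwich bound stated in the abstract can be checked numerically for a standard relative entropy such as the Kullback-Leibler divergence, which is the Rényi divergence of order $1$ and therefore lies between the orders $0$ and $\infty$. The following sketch uses illustrative distributions of our own choosing and is not taken from the paper:

```python
import math

def d0(p, q):
    # Renyi divergence of order 0: -log of the Q-mass on the support of P
    return -math.log(sum(qi for pi, qi in zip(p, q) if pi > 0))

def kl(p, q):
    # Kullback-Leibler divergence (Renyi divergence of order 1)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def dinf(p, q):
    # Renyi divergence of order infinity: log of the maximum likelihood ratio
    return math.log(max(pi / qi for pi, qi in zip(p, q) if pi > 0))

# Two illustrative distributions on a three-element alphabet (hypothetical example)
P = [0.5, 0.3, 0.2]
Q = [0.2, 0.3, 0.5]

# The ordering D_0 <= D_1 <= D_inf holds, consistent with the bound in the abstract
assert d0(P, Q) <= kl(P, Q) <= dinf(P, Q)
```

Since Rényi divergences are monotone non-decreasing in the order, any relative entropy satisfying the paper's axioms inherits the same two-sided bound.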