On Rényi Information Measures and Their Applications

Christoph Pfister (Author)

Amos Lapidoth (Editor)

Book
X, 174 pages
2020
Hartung-Gorre (Publisher)
9783866286634 (ISBN)

CHF 89.60 incl. VAT
The solutions to many problems in information theory can be expressed using Shannon’s information measures such as entropy, relative entropy, and mutual information. Other problems—for example source coding with exponential cost functions, guessing, and task encoding—require Rényi’s information measures, which generalize Shannon’s.
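
For orientation, the order-α Rényi entropy and Rényi divergence, which generalize Shannon's entropy and relative entropy, are the standard definitions (not part of the blurb itself): for α ∈ (0,1) ∪ (1,∞),

H_\alpha(P) = \frac{1}{1-\alpha} \log \sum_x P(x)^\alpha,
\qquad
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha-1} \log \sum_x P(x)^\alpha \, Q(x)^{1-\alpha},

and both converge to their Shannon counterparts H(P) and D(P‖Q) as α → 1.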

The contributions of this thesis are as follows: new problems related to guessing, task encoding, hypothesis testing, and horse betting are solved; and two new Rényi measures of dependence and a new conditional Rényi divergence appearing in these problems are analyzed.

The two closely related families of Rényi measures of dependence are studied in detail, and it is shown that they share many properties with Shannon’s mutual information, but the data-processing inequality is only satisfied by one of them. The dependence measures are based on the Rényi divergence and the relative α-entropy, respectively.
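
As a sketch, the first of the two measures matches the Lapidoth–Pfister dependence measure from the authors' earlier published work (the notation J_α is ours, and the thesis may differ in details):

J_\alpha(X;Y) = \min_{Q_X,\, Q_Y} D_\alpha\!\left( P_{XY} \,\big\|\, Q_X \times Q_Y \right),

where the minimization runs over all product distributions; the second measure is obtained analogously with the relative α-entropy in place of the Rényi divergence, and both reduce to Shannon's mutual information as α → 1.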

The new conditional Rényi divergence is compared with the conditional Rényi divergences that appear in the definitions of the dependence measures by Csiszár and Sibson, and the properties of all three are studied with emphasis on their behavior under data processing. In the same way that Csiszár’s and Sibson’s conditional divergences lead to their respective dependence measures, the new conditional divergence leads to the first of the new Rényi dependence measures. Moreover, the new conditional divergence is also related to Arimoto’s measures.
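
For context, Csiszár's and Sibson's measures admit the following minimax forms, which are standard in the α-mutual-information literature (the superscript notation is ours):

I_\alpha^{\mathrm{S}}(X;Y) = \min_{Q_Y} D_\alpha\!\left( P_{XY} \,\big\|\, P_X \times Q_Y \right),
\qquad
I_\alpha^{\mathrm{C}}(X;Y) = \min_{Q_Y} \mathbb{E}_{P_X}\!\left[ D_\alpha\!\left( P_{Y|X}(\cdot \mid X) \,\big\|\, Q_Y \right) \right],

and the corresponding conditional Rényi divergences are the expressions inside the minimizations over Q_Y.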

The first solved problem is about guessing with distributed encoders: Two dependent sequences are described separately and, based on their descriptions, have to be determined by a guesser. The description rates are characterized for which the expected number of guesses until correct (or, more generally, its ρ-th moment for some positive ρ) can be driven to one as the length of the sequences tends to infinity. The characterization involves the Rényi entropy and the Arimoto–Rényi conditional entropy.
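
As background for the distributed result, Arikan's classical single-sequence theorem (a known benchmark, not the thesis's contribution) states that for an i.i.d. sequence X^n and an optimal guessing order G,

\lim_{n\to\infty} \frac{1}{n} \log \mathbb{E}\!\left[ G(X^n)^{\rho} \right] = \rho\, H_{\frac{1}{1+\rho}}(X),

so the ρ-th guessing moment is governed by the Rényi entropy of order 1/(1+ρ); the distributed characterization replaces this single entropy with a rate region involving the Arimoto–Rényi conditional entropy.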

The related second problem is about distributed task encoding: Two dependent sequences are described separately, and a decoder produces a list of all the sequence pairs that share the given descriptions. The description rates are characterized for which the expected size of this list (or, more generally, its ρ-th moment for some positive ρ) can be driven to one as the length of the sequences tends to infinity. The characterization involves the Rényi entropy and the second of the new Rényi dependence measures.
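
The single-source baseline here is Bunte and Lapidoth's task-encoding theorem, which (as we recall it) states that when an i.i.d. sequence X^n is described at rate R, the ρ-th moment of the resulting list size can be driven to one if and only if

R > H_{\frac{1}{1+\rho}}(X),

again featuring the Rényi entropy of order 1/(1+ρ).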

The third problem is about a hypothesis testing setup where the observation consists of independent and identically distributed samples from either a known joint probability distribution or an unknown product distribution. The achievable error-exponent pairs are established and the Fenchel biconjugate of the error-exponent function is related to the first of the new Rényi dependence measures. Moreover, an example is provided where the error-exponent function is not convex and thus not equal to its Fenchel biconjugate.
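
For readers unfamiliar with the term: the Fenchel conjugate and biconjugate of a function f are

f^{*}(y) = \sup_x \left[ \langle x, y \rangle - f(x) \right],
\qquad
f^{**}(x) = \sup_y \left[ \langle x, y \rangle - f^{*}(y) \right],

and f** is the largest convex lower-semicontinuous function below f, so f = f** precisely when f is convex and lower semicontinuous; this is why the non-convex example establishes a genuine gap.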

The fourth problem is about horse betting, where, instead of Kelly’s expected log-wealth criterion, a more general family of power-mean utility functions is considered. The key role in the analysis is played by the Rényi divergence, and the setting where the gambler has access to side information motivates the new conditional Rényi divergence. The proposed family of utility functions in the context of gambling with side information also provides another operational meaning to the first of the new Rényi dependence measures. Finally, a universal strategy for independent and identically distributed races is presented that asymptotically maximizes the gambler’s utility function without knowing the winning probabilities or the parameter of the utility function.
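
For orientation, Kelly's criterion chooses the fractions b(·) of wealth wagered on each horse so as to maximize the expected log-wealth E[log(b(X) o(X))], where o(·) denotes the odds; this is maximized by proportional betting b = p regardless of the odds. The power-mean family studied in the thesis plausibly takes the form (our reconstruction, matching the published Bleuler–Lapidoth–Pfister utility; the notation U_β is ours):

U_\beta(b) = \frac{1}{\beta} \log \mathbb{E}\!\left[ \left( b(X)\, o(X) \right)^{\beta} \right], \qquad \beta \neq 0,

which recovers Kelly's criterion in the limit β → 0.
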
Publication date: 2020
Series: ETH Series in Information Theory and its Applications; 10
Place of publication: Konstanz
Language: English
Dimensions: 148 x 210 mm
Weight: 220 g
Subject areas: Mathematics / Computer Science > Computer Science > Networks; Mathematics / Computer Science > Computer Science > Further topics
Keywords: Arimoto–Rényi conditional entropy • dependence measures by Csiszár and Sibson • horse betting • hypothesis testing • new conditional Rényi divergence • Rényi entropy • Rényi’s information measures • Shannon’s information measures • task encoding
ISBN-13: 9783866286634
Condition: New