Can the maximum entropy principle be explained as a consistency requirement?

Jos Uffink

Research output: Contribution to journal › Article › peer-review

112 Scopus citations

Abstract

The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with certain compelling consistency requirements. This paper reviews these consistency arguments and the surrounding controversy. It is shown that the uniqueness proofs are flawed, or rest on unreasonably strong assumptions. A more general class of inference rules, maximizing the so-called Rényi entropies, is exhibited; these rules also fulfill the reasonable part of the consistency assumptions.
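For readers unfamiliar with the quantities named in the abstract, here is a minimal sketch in standard textbook notation; the notation is illustrative and not taken from the paper itself. The maximum entropy principle selects, among all distributions p compatible with given expectation constraints, the one maximizing the Shannon entropy

    H(p) = -\sum_i p_i \ln p_i,
    subject to \sum_i p_i = 1 and \sum_i p_i f_k(x_i) = F_k.

Its generalization, the principle of minimum information (maximum relative entropy), minimizes the relative entropy with respect to a prior distribution q under the same constraints:

    D(p \| q) = \sum_i p_i \ln \frac{p_i}{q_i}.

The Rényi entropies referred to in the final sentence are the one-parameter family

    H_\alpha(p) = \frac{1}{1-\alpha} \ln \left( \sum_i p_i^\alpha \right), \qquad \alpha > 0,\ \alpha \neq 1,

which recovers the Shannon entropy in the limit \alpha \to 1.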

Original language: English (US)
Pages (from-to): 223-261
Number of pages: 39
Journal: Studies in History and Philosophy of Modern Physics
Volume: 26
Issue number: 3
DOIs
State: Published - Dec 1995

Bibliographical note

Funding Information:
Brown of the Sub-Faculty of Philosophy at Oxford for their hospitality and encouragement. This work was supported by a grant from the British Council and the Nederlandse Organisatie voor Wetenschappelijk Onderzoek (NWO).
