Information Theory and Statistical Mechanics. AND Information Theory and Statistical Mechanics. II.
Payment Methods
- PayPal
- Credit Card
- Bank Transfer
- Public Administration
- Carta del Docente
Details
- Author: E. T. Jaynes.
- Edition: First editions.
- Description: Original printed wrappers.
- Languages: Italian
- First edition: Yes
Description
In THE PHYSICAL REVIEW, Second Series, Vol. 106, No. 4, pp. 620-630, and Vol. 108, No. 2, pp. 171-190, two entire issues in original printed wrappers. Very fine copies, ownership inscriptions. FIRST EDITIONS. In 1957 Jaynes published his first articles on information theory, "Information Theory and Statistical Mechanics" [9,10]. In these two articles Jaynes reformulated statistical mechanics in terms of probability distributions derived by use of the principle of maximum entropy. This reformulation simplified the mathematics, allowed for fundamental extensions of the theory, and reinterpreted statistical mechanics as inference based on incomplete information. These articles were published over the objection of a reviewer; Jaynes comments on this review in "Where do we Stand on Maximum Entropy?" (from the obituary "Edwin Thompson Jaynes, July 5, 1922 - April 30, 1998" by G. Larry Bretthorst, http://bayes.wustl.edu/etj/etj.html).
In physics, maximum entropy thermodynamics (colloquially, MaxEnt thermodynamics) views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies inference techniques rooted in Shannon information theory, Bayesian probability, and the principle of maximum entropy. These techniques are relevant to any situation requiring prediction from incomplete or insufficient data (e.g., image reconstruction, signal processing, spectral analysis, and inverse problems). MaxEnt thermodynamics began with two papers by Edwin T. Jaynes published in the 1957 Physical Review (Wikipedia).
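As a small illustration of the principle of maximum entropy described above, the Python sketch below finds, among all distributions over a few discrete energy levels with a prescribed mean energy, the one with maximal Shannon entropy; the result has the familiar Boltzmann/Gibbs form. This is only an illustrative sketch: the energy levels and the target mean are hypothetical values chosen for the example, not taken from the papers.

```python
# Minimal sketch of the maximum entropy principle: among distributions over
# discrete energy levels E_i with a fixed mean energy, the entropy-maximizing
# one is proportional to exp(-lam * E_i), with lam fixed by the constraint.
# Energy levels and target mean below are made-up illustrative numbers.
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0])   # hypothetical energy levels
target_mean = 1.2                    # prescribed mean energy (illustrative)

def mean_energy(lam):
    """Mean energy under the distribution p_i proportional to exp(-lam * E_i)."""
    w = np.exp(-lam * E)
    p = w / w.sum()
    return p @ E

# Solve for the Lagrange multiplier so the mean-energy constraint is satisfied.
lam = brentq(lambda l: mean_energy(l) - target_mean, -50.0, 50.0)
p = np.exp(-lam * E)
p /= p.sum()

print("lambda  =", lam)
print("p       =", p)                               # maximum entropy distribution
print("entropy =", -(p * np.log(p)).sum())
```

In Jaynes's formulation the Lagrange multiplier found by the root solve plays the role of an inverse temperature, which is how the canonical distribution of statistical mechanics emerges from inference over incomplete information.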