Calibration Invariance of the MaxEnt Distribution in the Maximum Entropy Principle

Entropy. 2021 Jan 11;23(1). [epub] 20210111

Status: PubMed-not-MEDLINE; Language: English; Country: Switzerland; Medium: electronic

Document type: journal article

Persistent link: https://www.medvik.cz/link/pmid33440777

Grant support
I 3073 Austrian Science Fund FWF - Austria
882184 Österreichische Forschungsförderungsgesellschaft
3073 Austrian Science Fund
19-16066S Grantová Agentura České Republiky

The maximum entropy principle consists of two steps: the first step is to find the distribution that maximizes entropy under given constraints; the second step is to calculate the corresponding thermodynamic quantities. The second step is determined by the relation of the Lagrange multipliers to measurable physical quantities, such as temperature or Helmholtz free energy/free entropy. We show that for a given MaxEnt distribution, a whole class of entropies and constraints leads to the same distribution but generally to different thermodynamics. Two simple classes of transformations that preserve the MaxEnt distributions are studied: the first case is a transformation of the entropy to an arbitrary increasing function of that entropy; the second case is a transformation of the energetic constraint to a combination of the normalization and energetic constraints. We derive the group transformations of the Lagrange multipliers corresponding to these transformations and determine their connections to thermodynamic quantities. For each case, we provide a simple example of the transformation.
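The first transformation class can be checked numerically. The sketch below (illustrative energies and β, not taken from the paper) builds the canonical distribution that maximizes Shannon entropy S under an energy constraint, then verifies that the same distribution is a stationary point when the entropy is replaced by the increasing function g(S) = exp(S), provided the Lagrange multipliers are rescaled by g'(S) = exp(S):

```python
import math

# Hypothetical three-level system; energies and beta are illustrative choices.
eps = [0.0, 1.0, 2.0]
beta = 0.7

# Canonical MaxEnt distribution for Shannon entropy: p_i = exp(-beta*eps_i)/Z
Z = sum(math.exp(-beta * e) for e in eps)
p = [math.exp(-beta * e) / Z for e in eps]

S = -sum(pi * math.log(pi) for pi in p)   # Shannon entropy at the maximizer

# Transformed entropy g(S) = exp(S), an increasing function of S.
# Stationarity of g(S) - beta_t*<eps> - alpha_t*<1> at the SAME p requires the
# rescaled multipliers beta_t = g'(S)*beta and a matching shift of alpha_t:
beta_t = math.exp(S) * beta
alpha_t = math.exp(S) * (math.log(Z) - 1.0)

# Gradient of the transformed Lagrangian at p, which should vanish componentwise:
#   dL/dp_i = g'(S) * (-ln p_i - 1) - beta_t*eps_i - alpha_t
grad = [math.exp(S) * (-math.log(pi) - 1.0) - beta_t * e - alpha_t
        for pi, e in zip(p, eps)]
print(max(abs(g) for g in grad))  # ~0: same distribution, rescaled multipliers
```

The gradient vanishes identically because g'(S) multiplies the whole stationarity condition of the original problem, so the distribution is unchanged while the Lagrange multipliers (and hence the inferred thermodynamic quantities, e.g. the inverse temperature) are rescaled.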

References (show more in PubMed)

Jaynes E.T. Information Theory and Statistical Mechanics. Phys. Rev. 1957;106:620. doi: 10.1103/PhysRev.106.620. DOI

Jaynes E.T. Information Theory and Statistical Mechanics. II. Phys. Rev. 1957;108:171. doi: 10.1103/PhysRev.108.171. DOI

Burg J.P. The relationship between maximum entropy spectra and maximum likelihood spectra. Geophysics. 1972;37:375–376. doi: 10.1190/1.1440265. DOI

Rényi A. Selected Papers of Alfréd Rényi. Volume 2. Akadémiai Kiadó; Budapest, Hungary: 1976.

Havrda J., Charvát F. Quantification Method of Classification Processes. Concept of Structural α-Entropy. Kybernetika. 1967;3:30–35.

Sharma B.D., Mitter J., Mohan M. On Measures of “Useful” Information. Inf. Control. 1978;39:323–336. doi: 10.1016/S0019-9958(78)90671-X. DOI

Tsallis C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988;52:479–487. doi: 10.1007/BF01016429. DOI

Frank T., Daffertshofer A. Exact time-dependent solutions of the Renyi Fokker-Planck equation and the Fokker-Planck equations related to the entropies proposed by Sharma and Mittal. Physica A. 2000;285:351–366. doi: 10.1016/S0378-4371(00)00178-3. DOI

Kaniadakis G. Statistical mechanics in the context of special relativity. Phys. Rev. E. 2002;66:056125. doi: 10.1103/PhysRevE.66.056125. PubMed DOI

Jizba P., Arimitsu T. The world according to Rényi: Thermodynamics of multifractal systems. Ann. Phys. 2004;312:17–59. doi: 10.1016/j.aop.2004.01.002. DOI

Hanel R., Thurner S. A comprehensive classification of complex statistical systems and an ab-initio derivation of their entropy and distribution functions. Europhys. Lett. 2011;93:20006. doi: 10.1209/0295-5075/93/20006. DOI

Thurner S., Hanel R., Klimek P. Introduction to the Theory of Complex Systems. Oxford University Press; Oxford, UK: 2018.

Korbel J., Hanel R., Thurner S. Classification of complex systems by their sample-space scaling exponents. New J. Phys. 2018;20:093007. doi: 10.1088/1367-2630/aadcbe. DOI

Tempesta P., Jensen H.J. Universality classes and information-theoretic measures of complexity via group entropies. Sci. Rep. 2020;10:1–11. doi: 10.1038/s41598-020-60188-y. PubMed DOI PMC

Ilić V.M., Stanković M.S. Generalized Shannon-Khinchin axioms and uniqueness theorem for pseudo-additive entropies. Physica A. 2014;411:138–145. doi: 10.1016/j.physa.2014.05.009. DOI

Ilić V.M., Scarfone A.M., Wada T. Equivalence between four versions of thermostatistics based on strongly pseudoadditive entropies. Phys. Rev. E. 2019;100:062135. doi: 10.1103/PhysRevE.100.062135. PubMed DOI

Czachor M. Unifying Aspects of Generalized Calculus. Entropy. 2020;22:1180. doi: 10.3390/e22101180. PubMed DOI PMC

Beck C., Schlögl F. Thermodynamics of Chaotic Systems: An Introduction. Cambridge University Press; Cambridge, UK: 1993.

Abe S. Geometry of escort distributions. Phys. Rev. E. 2003;68:031101. doi: 10.1103/PhysRevE.68.031101. PubMed DOI

Bercher J.-F. On escort distributions, q-gaussians and Fisher information. AIP Conf. Proc. 2011;1305:208.

Czachor M., Naudts J. Thermostatistics based on Kolmogorov-Nagumo averages: Unifying framework for extensive and nonextensive generalizations. Phys. Lett. A. 2002;298:369–374. doi: 10.1016/S0375-9601(02)00540-6. DOI

Scarfone A.M., Matsuzoe H., Wada T. Consistency of the structure of Legendre transform in thermodynamics with the Kolmogorov-Nagumo average. Phys. Lett. A. 2016;380:3022–3028. doi: 10.1016/j.physleta.2016.07.012. DOI

Bercher J.-F. Tsallis distribution as a standard maximum entropy solution with ‘tail’ constraint. Phys. Lett. A. 2008;372:5657–5659. doi: 10.1016/j.physleta.2008.06.088. DOI

Pressé S., Ghosh K., Lee J., Dill K.A. Nonadditive Entropies Yield Probability Distributions with Biases not Warranted by the Data. Phys. Rev. Lett. 2013;111:180604. doi: 10.1103/PhysRevLett.111.180604. PubMed DOI

Oikonomou T., Bagci B. Misusing the entropy maximization in the jungle of generalized entropies. Phys. Lett. A. 2017;381:207–211. doi: 10.1016/j.physleta.2016.11.005. DOI

Tsallis C., Mendes R.S., Plastino A.R. The role of constraints within generalized nonextensive statistics. Physica A. 1998;261:534–554. doi: 10.1016/S0378-4371(98)00437-3. DOI

Martínez S., Nicolás F., Pennini F., Plastino A. Tsallis’ entropy maximization procedure revisited. Physica A. 2000;286:489–502. doi: 10.1016/S0378-4371(00)00359-9. DOI

Plastino A., Plastino A.R. On the universality of thermodynamics’ Legendre transform structure. Phys. Lett. A. 1997;226:257–263. doi: 10.1016/S0375-9601(96)00942-5. DOI

Rama S.K. Tsallis Statistics: Averages and a Physical Interpretation of the Lagrange Multiplier β. Phys. Lett. A. 2000;276:103–108. doi: 10.1016/S0375-9601(00)00634-4. DOI

Campisi M., Bagci G.B. Tsallis Ensemble as an Exact Orthode. Phys. Lett. A. 2007;362:11–15. doi: 10.1016/j.physleta.2006.09.081. DOI

Dixit P.D., Wagoner J., Weistuch C., Pressé S., Ghosh K., Dill K.A. Perspective: Maximum caliber is a general variational principle for dynamical systems. J. Chem. Phys. 2018;148:010901. doi: 10.1063/1.5012990. PubMed DOI

Lucia U. Stationary Open Systems: A Brief Review on Contemporary Theories on Irreversibility. Physica A. 2013;392:1051–1062. doi: 10.1016/j.physa.2012.11.027. DOI

Palazzo P. Hierarchical Structure of Generalized Thermodynamic and Informational Entropy. Entropy. 2018;20:553. doi: 10.3390/e20080553. PubMed DOI PMC

Shore J.E., Johnson R.W. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inf. Theor. 1980;26:26–37. doi: 10.1109/TIT.1980.1056144. DOI

Shore J.E., Johnson R.W. Properties of cross-entropy minimization. IEEE Trans. Inf. Theor. 1981;27:472–482. doi: 10.1109/TIT.1981.1056373. DOI

Uffink J. Can the maximum entropy principle be explained as a consistency requirement? Stud. Hist. Philos. Mod. Phys. 1995;26:223–261. doi: 10.1016/1355-2198(95)00015-1. DOI

Tsallis C. Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems. Entropy. 2015;17:2853–2861. doi: 10.3390/e17052853. DOI

Pressé S., Ghosh K., Lee J., Dill K.A. Reply to C. Tsallis’ “Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems”. Entropy. 2015;17:5043–5046. doi: 10.3390/e17075043. DOI

Oikonomou T., Bagci G.B. Rényi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data. Phys. Rev. E. 2019;99:032134. doi: 10.1103/PhysRevE.99.032134. PubMed DOI

Jizba P., Korbel J. Comment on “Rényi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data”. Phys. Rev. E. 2019;100:026101. doi: 10.1103/PhysRevE.100.026101. PubMed DOI

Oikonomou T., Bagci G.B. Reply to “Comment on Rényi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data”. Phys. Rev. E. 2019;100:026102. doi: 10.1103/PhysRevE.100.026102. PubMed DOI

Jizba P., Korbel J. Maximum Entropy Principle in Statistical Inference: Case for Non-Shannonian Entropies. Phys. Rev. Lett. 2019;122:120601. doi: 10.1103/PhysRevLett.122.120601. PubMed DOI

Jizba P., Korbel J. When Shannon and Khinchin meet Shore and Johnson: Equivalence of information theory and statistical inference axiomatics. Phys. Rev. E. 2020;101:042126. doi: 10.1103/PhysRevE.101.042126. PubMed DOI

Plastino A., Plastino A.R. Tsallis Entropy and Jaynes’ Information Theory Formalism. Braz. J. Phys. 1999;29:50–60. doi: 10.1590/S0103-97331999000100005. DOI

Naudts J. Generalized Thermostatistics. Springer; London, UK: 2011.

Biró T.S., Ván P. Zeroth law compatibility of nonadditive thermodynamics. Phys. Rev. E. 2011;83:061147. doi: 10.1103/PhysRevE.83.061147. PubMed DOI

Wada T., Scarfone A.M. Connections between Tsallis’ formalisms employing the standard linear average energy and ones employing the normalized q-average energy. Phys. Lett. A. 2005;335:351–362. doi: 10.1016/j.physleta.2004.12.054. DOI

Most recent citations (show more in Medvik | PubMed)

The Statistical Foundations of Entropy. Entropy. 2021 Oct 19;23(10). [epub] 20211019
