Information density, structure and entropy in equilibrium and non-equilibrium systems

During a spontaneous change, a macroscopic physical system evolves towards a macro-state with more realizations. This observation underlies the statistical-mechanical version of the second law of thermodynamics, and it provides an interpretation of entropy in terms of probabilities. However, we cannot rely on the statistical-mechanical expressions for entropy in systems that are far from equilibrium. In this paper, we compare various extensions of the definition of entropy that have been proposed for non-equilibrium systems. It has recently been proposed that measures of information density may serve to quantify entropy in both equilibrium and non-equilibrium systems. We propose a new 'bit-wise' method to measure the information density of off-lattice systems; this method does not rely on coarse-graining of the particle coordinates. We then compare different estimates of the system entropy, based on information density and on the structural properties of the system, and check whether the various entropies are mutually consistent and, importantly, whether they can detect non-trivial ordering phenomena. We find that, except for simple (one-dimensional) cases, the different methods yield answers that are at best qualitatively similar, and often not even that, although in several cases the different entropy estimates do detect ordering phenomena qualitatively. Our entropy estimates based on bit-wise data compression contain no adjustable scaling factor, and show large quantitative differences from the thermodynamic entropy obtained from equilibrium simulations. Hence, our results suggest that, at present, there is no single structure-based entropy definition with general validity for equilibrium and non-equilibrium systems.
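To illustrate the general idea of a compression-based, 'bit-wise' estimate of information density that operates directly on off-lattice particle coordinates (without coarse-graining onto a grid), the sketch below compresses the raw IEEE-754 binary representation of the coordinates with a general-purpose compressor. This is a hypothetical, minimal illustration of the concept, not the authors' actual algorithm; the function name and the choice of `zlib` are assumptions made for the example.

```python
import struct
import zlib


def compression_entropy_estimate(coords):
    """Estimate an information density (compressed bits per particle)
    from the raw binary representation of particle coordinates.

    Hypothetical sketch: serialize each coordinate bit-for-bit as a
    64-bit float (no lattice coarse-graining), compress the byte
    stream, and report the compressed size per particle.
    """
    # Serialize every coordinate of every particle as a little-endian
    # IEEE-754 double, preserving the full bit pattern.
    raw = b"".join(struct.pack("<d", x) for particle in coords for x in particle)
    compressed = zlib.compress(raw, 9)
    return 8 * len(compressed) / len(coords)  # bits per particle


# Usage: compare an ordered (crystal-like) configuration with a
# disordered (random) one; the ordered one should compress better,
# i.e. yield a lower estimated information density.
import random

random.seed(0)
ordered = [(float(i), float(j), 0.0) for i in range(10) for j in range(10)]
disordered = [
    (random.uniform(0.0, 10.0), random.uniform(0.0, 10.0), 0.0)
    for _ in range(100)
]

s_ordered = compression_entropy_estimate(ordered)
s_disordered = compression_entropy_estimate(disordered)
```

In this toy setting the compressor exploits the regularity of the crystal-like bit patterns, so the ordered configuration yields fewer compressed bits per particle than the random one; as the abstract notes, however, such estimates need not agree quantitatively with the thermodynamic entropy.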