Time–information uncertainty relations in thermodynamics
Abstract
Physical systems that power motion and create structure in a fixed amount of time dissipate energy and produce entropy. Whether living or synthetic, systems performing these dynamic functions must balance dissipation and speed. Here, we show that rates of energy and entropy exchange are subject to a speed limit -- a time-information uncertainty relation -- imposed by the rates of change in the information content of the system. This uncertainty relation bounds the time that elapses before the change in a thermodynamic quantity has the same magnitude as its initial standard deviation. From this general bound, we establish a family of speed limits for heat, work, entropy production, and entropy flow depending on the experimental constraints on the system. In all of these inequalities, the time scale of transient dynamical fluctuations is universally bounded by the Fisher information. Moreover, they all have a mathematical form that mirrors the Mandelstam-Tamm version of the time-energy uncertainty relation in quantum mechanics. These bounds on the speed of arbitrary observables apply to transient systems away from thermodynamic equilibrium, independent of the physical assumptions about the stochastic dynamics or their function.
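The Mandelstam-Tamm analogy described above can be made concrete in a minimal sketch. Assuming a time-dependent probability distribution over states with Fisher information $I(t)$ quantifying its rate of change, and an observable $A$ with mean $\langle A\rangle_t$ and standard deviation $\Delta A_t$, the type of speed limit summarized in the abstract takes the form:

```latex
% Rate of change of the mean of A is bounded by its fluctuations
% times the square root of the Fisher information (Cauchy-Schwarz):
\left|\frac{d\langle A\rangle_t}{dt}\right| \;\le\; \Delta A_t \,\sqrt{I(t)}

% Hence the characteristic time for <A> to change by one standard
% deviation is bounded below by the inverse Fisher information rate,
% mirroring the Mandelstam-Tamm time-energy uncertainty relation:
\tau_A \;\equiv\; \frac{\Delta A_t}{\left|\,d\langle A\rangle_t/dt\,\right|} \;\ge\; \frac{1}{\sqrt{I(t)}}
```

Taking $A$ to be heat, work, entropy production, or entropy flow then yields the family of thermodynamic speed limits referenced in the abstract; the precise definitions and conditions are given in the body of the paper.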