Semantic information, agency, and nonequilibrium statistical physics

Information theory provides various measures of the correlations between the states of two systems, sometimes called measures of "syntactic information". By contrast, the concept of "semantic information" refers to information that is in some sense meaningful rather than merely correlational. Semantic information plays an important role in many fields, including biology, cognitive science, and artificial intelligence, and there has been a long-standing interest in a quantitative theory of it. In this work, we introduce such a theory, which defines semantic information as the syntactic information that a physical system has about its environment and that is causally necessary for the system to maintain its own existence. We operationalize self-maintenance as the ability of the system to keep itself in a low-entropy state, which allows us to connect the framework to results in nonequilibrium statistical physics. Our approach leads naturally to formal definitions of notions such as "value of information", "semantic content", and "agency". It is grounded purely in the intrinsic dynamics of a system coupled to its environment, and is applicable to any physical system.
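As a rough illustration of how these quantities could be written down (the notation below is chosen here for illustration and does not appear in the abstract), one can take the syntactic information to be the mutual information between system X and environment Y, the viability to be the negative Shannon entropy of the system's marginal distribution, and the value of information to be the viability lost at a later time when the initial system-environment correlations are counterfactually scrambled:

\begin{align*}
  I_{\mathrm{syn}} &= I(X_0 ; Y_0)
    && \text{syntactic information at the initial time}\\
  V(\tau) &= -H(X_\tau)
    && \text{viability: negative Shannon entropy of the system at horizon } \tau\\
  \Delta V(\tau) &= V(\tau) - \hat{V}(\tau)
    && \text{value of information: viability lost when the initial correlations}\\
  & && \text{are scrambled by an intervention } \hat{p}_0(x,y) = p_0(x)\,p_0(y)
\end{align*}

On this sketch, semantic information would correspond to the part of the syntactic information that carries positive value in this sense, i.e. the correlations whose removal by the intervention degrades the system's ability to remain in a low-entropy state.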

Artemy Kolchinsky, David H. Wolpert

Source: arxiv.org