Month: July 2020

Coherent dynamics enhanced by uncorrelated noise

Zachary G. Nicolaou, Michael Sebek, István Z. Kiss, and Adilson E. Motter

Phys. Rev. Lett.

 

Synchronization is a widespread phenomenon observed in physical, biological, and social networks, which persists even under the influence of strong noise. Previous research on oscillators subject to common noise has shown that noise can actually facilitate synchronization, as correlations in the dynamics can be inherited from the noise itself. However, in many spatially distributed networks, such as the mammalian circadian system, the noise that different oscillators experience can be effectively uncorrelated. Here, we show that uncorrelated noise can in fact enhance synchronization when the oscillators are coupled together. Strikingly, our analysis also shows that uncorrelated noise can be more effective than common noise in enhancing synchronization. We first establish these results theoretically for phase and phase-amplitude oscillators subject to additive noise, multiplicative noise, or both. We then confirm the predictions through experiments on coupled electrochemical oscillators. Our findings suggest that uncorrelated noise can promote rather than inhibit coherence in natural systems and that the same effect can be harnessed in engineered systems.
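To make the setting concrete, here is a minimal simulation sketch, not the authors' model: sine-coupled phase oscillators driven by independent (uncorrelated) Gaussian noise, with coherence tracked by the Kuramoto order parameter r(t). The coupling strength, noise intensity, and frequency spread are illustrative assumptions; probing the reported effect would amount to sweeping the noise intensity and comparing uncorrelated with common noise.

```python
import numpy as np

# Illustrative sketch (not the authors' specific system): N phase oscillators
# with mean-field sine coupling and independent noise on each unit.
rng = np.random.default_rng(0)

N = 10                              # number of oscillators
K = 0.5                             # coupling strength (assumed value)
omega = rng.normal(0.0, 0.1, N)     # heterogeneous natural frequencies
sigma = 0.3                         # noise intensity (assumed value)
dt, T = 0.01, 200.0
steps = int(T / dt)

theta = rng.uniform(0, 2 * np.pi, N)
r_hist = np.empty(steps)

for t in range(steps):
    # sum_j sin(theta_j - theta_i) for each oscillator i
    coupling = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    # independent Wiener increments: the noise is uncorrelated across oscillators
    dW = rng.normal(0.0, np.sqrt(dt), N)
    theta = theta + (omega + coupling) * dt + sigma * dW
    r_hist[t] = np.abs(np.exp(1j * theta).mean())   # Kuramoto order parameter

print("time-averaged coherence r =", r_hist[steps // 2:].mean())
```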

Source: journals.aps.org

Compressing Phase Space Detects State Changes in Nonlinear Dynamical Systems

Valeria d’Andrea and Manlio De Domenico

Complexity

Volume 2020, Article ID 8650742

 

Equations governing the nonlinear dynamics of complex systems are usually unknown, and indirect methods are used to reconstruct their manifolds. These methods, in turn, depend on embedding parameters whose accurate estimation requires further techniques and long temporal sequences. In this paper, we show that an optimal reconstruction can be achieved by lossless compression of the system's time course, providing a self-consistent analysis of its dynamics and a measure of its complexity, even for short sequences. Our complexity measure detects changes in the system's state, such as the weak synchronization phenomena that characterize many systems, in a single step that integrates results from Lyapunov and fractal analysis.
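As a rough illustration of the general idea, and not of the authors' specific method, the sketch below coarse-grains a time series into symbols and uses a lossless compressor (zlib) as a complexity proxy: regular dynamics compress well, chaotic dynamics do not. The logistic-map regimes and the bin count are arbitrary choices.

```python
import zlib
import numpy as np

def compression_complexity(x, n_bins=16):
    """Crude complexity proxy: symbolize the series into n_bins levels, then
    report the zlib compression ratio (near 0 = very regular, near 1 = incompressible)."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]
    symbols = np.digitize(x, edges).astype(np.uint8).tobytes()
    return len(zlib.compress(symbols, 9)) / len(symbols)

def logistic_series(r, n=5000, x0=0.4):
    """Iterate the logistic map x -> r x (1 - x)."""
    x = np.empty(n)
    x[0] = x0
    for i in range(n - 1):
        x[i + 1] = r * x[i] * (1 - x[i])
    return x

# A state change (periodic -> chaotic regime) shows up as a jump in the proxy:
print("periodic (r=3.5):", compression_complexity(logistic_series(3.5)))
print("chaotic  (r=3.9):", compression_complexity(logistic_series(3.9)))
```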

Source: www.hindawi.com

Predicting the number of viable autocatalytic sets in systems that combine catalysis and inhibition

Stuart Kauffman, Mike Steel

 

The emergence of self-sustaining autocatalytic networks in chemical reaction systems has been studied as a possible mechanism for modelling how living systems first arose. It has been known for several decades that such networks will form within systems of polymers (under cleavage and ligation reactions) under a simple process of random catalysis, and this process has since been mathematically analysed. In this paper, we provide an exact expression for the expected number of self-sustaining autocatalytic networks that will form in a general chemical reaction system, and the expected number of these networks that will also be uninhibited (by some molecule produced by the system). Using these equations, we are able to describe the patterns of catalysis and inhibition that maximise or minimise the expected number of such networks. We apply our results to derive a general theorem concerning the trade-off between catalysis and inhibition, and to provide some insight into the extent to which the expected number of self-sustaining autocatalytic networks coincides with the probability that at least one such system is present.
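Regarding the final point, two standard probabilistic relations, stated here only as general background and not as the paper's exact expressions, indicate when the expected number N of viable sets is informative about the probability that at least one is present:

```latex
% Markov's inequality, and a Poisson-type approximation in which viable sets
% form roughly independently; expectation and probability coincide only when
% the expected number is small.
\[
  \Pr(N \ge 1) \;\le\; \mathbb{E}[N],
  \qquad
  \Pr(N \ge 1) \;\approx\; 1 - e^{-\mathbb{E}[N]} \;\approx\; \mathbb{E}[N]
  \quad \text{for } \mathbb{E}[N] \ll 1 .
\]
```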

Source: arxiv.org

A Fourth Law of Thermodynamics: Synergy Increases Free Energy While Decreasing Entropy

Klaus Jaffe

 

Synergy emerges from synchronized reciprocal positive feedback loops within a network of diverse actors. For this process to proceed, compatible information from different sources synchronously coordinates the actions of the actors, resulting in a nonlinear increase in the useful work or potential energy the system can manage. In contrast, noise is produced when incompatible information is mixed. The synergy produced by the coordination of different agents achieves nonlinear gains in free energy and in information (negentropy) that are greater than the sum of the parts. The final product of new synergies is an increase in the individual autonomy of an organism, which achieves greater emancipation from the environment through gains in productivity, efficiency, capacity for flexibility, self-regulation, and self-control of behavior via a synchronized division of ever more specialized labor. Examples providing quantitative data for this phenomenon are presented. The results show that increases in free energy density require decreases in entropy density. This is proposed as a fourth law of thermodynamics.
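As a plausibility check on the final claim, and only under the assumption that the Helmholtz free energy of equilibrium thermodynamics is meant (the preprint may define its densities differently):

```latex
\[
  F = U - TS
  \quad\Longrightarrow\quad
  \Delta F = -\,T\,\Delta S
  \quad \text{at fixed internal energy } U \text{ and temperature } T,
\]
```

so, under that reading, free energy can increase only if entropy decreases.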

Source: www.preprints.org