Infodynamics, Information Entropy and the Second Law of Thermodynamics

Klaus Jaffe

Information and Energy are related. The Second Law of Thermodynamics applies to changes in energy and heat, but it does not apply to information dynamics. Advances in Infodynamics have made it clear that Total Information contains Useful Information and Noise, both of which may be gained or lost in irreversible processes. Increases in the Free Energy of open systems require more Useful Information, reducing or increasing Thermodynamic Entropy. Empirical data show that the more Free Energy is created, the more Useful Information is required; and the more Useful Information is produced, the more Free Energy is spent. The Energy–Information relationship underlies all processes where novel structures, forms, and systems emerge. Although science cannot predict the structure of information that will produce Free Energy, engineers have been successful in finding Useful Information that increases Free Energy. Here I explore the fate of information in irreversible processes and its relation to the Second Law of Thermodynamics.
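
One common information-theoretic reading of the Total Information = Useful Information + Noise decomposition splits the entropy of a system's state into the part correlated with an environmental variable (the mutual information, a "useful" part) and the residual conditional entropy (the "noise"). The sketch below, a minimal NumPy illustration on a hypothetical discrete joint distribution, shows that bookkeeping with standard Shannon measures; it is not Jaffe's formalism, just one way to make the decomposition concrete.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution P(X, Y) of a system state X and an
# environment variable Y (values invented for illustration).
p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.25],
                 [0.10, 0.20]])

p_x = p_xy.sum(axis=1)   # marginal distribution of the system
p_y = p_xy.sum(axis=0)   # marginal distribution of the environment

total_information = shannon_entropy(p_x)                  # H(X): "Total Information"
useful_information = (shannon_entropy(p_x) + shannon_entropy(p_y)
                      - shannon_entropy(p_xy.flatten()))  # I(X;Y): correlated, "useful" part
noise = total_information - useful_information            # H(X|Y): residual "noise"

print(f"Total Information  H(X)   = {total_information:.3f} bits")
print(f"Useful Information I(X;Y) = {useful_information:.3f} bits")
print(f"Noise              H(X|Y) = {noise:.3f} bits")
```

On this reading, Landauer's principle supplies the standard bridge to thermodynamics: erasing one bit of information dissipates at least kT ln 2 of free energy, the kind of energy-information accounting the abstract points to.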

Read the full article at: www.qeios.com

Thermodynamics of Computations with Absolute Irreversibility, Unidirectional Transitions, and Stochastic Computation Times

Gonzalo Manzano, Gülce Kardeş, Édgar Roldán, and David H. Wolpert
Phys. Rev. X 14, 021026

Developing a thermodynamic theory of computation is a challenging task at the interface of nonequilibrium thermodynamics and computer science. In particular, this task requires dealing with difficulties such as stochastic halting times, unidirectional (possibly deterministic) transitions, and restricted initial conditions, features common in real-world computers. Here, we present a framework which tackles all such difficulties by extending the martingale theory of nonequilibrium thermodynamics to generic nonstationary Markovian processes, including those with broken detailed balance and/or absolute irreversibility. We derive several universal fluctuation relations and second-law-like inequalities that provide both lower and upper bounds on the intrinsic dissipation (mismatch cost) associated with any periodic process—in particular, the periodic processes underlying all current digital computation. Crucially, these bounds apply even if the process has stochastic stopping times, as it does in many computational machines. We illustrate our results with exhaustive numerical simulations of deterministic finite automata processing bit strings, one of the fundamental models of computation from theoretical computer science. We also provide universal equalities and inequalities for the acceptance probability of words of a given length by a deterministic finite automaton in terms of thermodynamic quantities, and outline connections between computer science and stochastic resetting. Our results, while motivated from the computational context, are applicable far more broadly.
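
The computer-science side of this setup can be made concrete with a toy example. The sketch below defines a hypothetical parity-checking deterministic finite automaton (not one of the machines analyzed in the paper) over bit strings and computes the acceptance probability of words of a given length, both exactly and by sampling i.i.d. unbiased bits; the thermodynamic quantities the paper attaches to such acceptance probabilities are not modeled here.

```python
import itertools
import random

# Toy DFA over the alphabet {0, 1}: accepts bit strings with an even number of 1s.
# This is an illustrative automaton, not one of the machines studied in the paper.
START = "even"
ACCEPTING = {"even"}
DELTA = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd",   ("odd", "1"): "even"}

def accepts(word):
    """Run the DFA on a string of '0'/'1' characters and report acceptance."""
    state = START
    for symbol in word:
        state = DELTA[(state, symbol)]
    return state in ACCEPTING

def acceptance_probability_exact(length):
    """Exact acceptance probability over all 2**length equally likely words."""
    words = ("".join(bits) for bits in itertools.product("01", repeat=length))
    return sum(accepts(w) for w in words) / 2 ** length

def acceptance_probability_sampled(length, n_samples=100_000, seed=0):
    """Monte Carlo estimate with i.i.d. unbiased input bits."""
    rng = random.Random(seed)
    hits = sum(accepts("".join(rng.choice("01") for _ in range(length)))
               for _ in range(n_samples))
    return hits / n_samples

if __name__ == "__main__":
    L = 8
    print(f"exact   P(accept | length={L}) = {acceptance_probability_exact(L):.4f}")
    print(f"sampled P(accept | length={L}) = {acceptance_probability_sampled(L):.4f}")
```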

Read the full article at: link.aps.org

Challenges and opportunities for digital twins in precision medicine: a complex systems perspective

Manlio De Domenico, Luca Allegri, Guido Caldarelli, Valeria d’Andrea, Barbara Di Camillo, Luis M. Rocha, Jordan Rozum, Riccardo Sbarbati, Francesco Zambelli

The adoption of digital twins (DTs) in precision medicine is increasingly viable, propelled by extensive data collection and advancements in artificial intelligence (AI), alongside traditional biomedical methodologies. However, the reliance on black-box predictive models, which utilize large datasets, presents limitations that could impede the broader application of DTs in clinical settings. We argue that hypothesis-driven generative models, particularly multiscale modeling, are essential for boosting the clinical accuracy and relevance of DTs, thereby making a significant impact on healthcare innovation. This paper explores the transformative potential of DTs in healthcare, emphasizing their capability to simulate complex, interdependent biological processes across multiple scales. By integrating generative models with extensive datasets, we propose a scenario-based modeling approach that enables the exploration of diverse therapeutic strategies, thus supporting dynamic clinical decision-making. This method not only leverages advancements in data science and big data for improving disease treatment and prevention but also incorporates insights from complex systems and network science, quantitative biology, and digital medicine, promising substantial advancements in patient care.
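
As a cartoon of what scenario-based, hypothesis-driven modeling means in practice, the sketch below integrates a deliberately simple mechanistic ODE (a hypothetical tumor-drug toy model with invented rate constants, not any model from the paper) under two therapeutic scenarios and compares outcomes; an actual digital twin would replace this with multiscale models calibrated to an individual patient's data.

```python
import numpy as np

def simulate(dose_schedule, days=120, dt=0.1):
    """
    Purely illustrative mechanistic model: tumor burden T grows logistically
    and is killed in proportion to drug concentration D, which is cleared
    exponentially and replenished according to the dosing scenario.
    """
    growth, capacity, kill, clearance = 0.08, 100.0, 0.05, 0.3  # hypothetical rates
    T, D = 10.0, 0.0
    trajectory = []
    for step in range(int(days / dt)):
        t = step * dt
        D += dose_schedule(t) * dt                       # scenario-dependent dosing
        dT = growth * T * (1 - T / capacity) - kill * D * T
        dD = -clearance * D
        T, D = max(T + dT * dt, 0.0), max(D + dD * dt, 0.0)
        trajectory.append(T)
    return np.array(trajectory)

# Two hypothetical therapeutic scenarios explored on the same toy "twin".
def continuous_low(t):
    return 0.4                                  # constant low-dose infusion

def pulsed_high(t):
    return 4.0 if (t % 14.0) < 1.0 else 0.0     # one-day pulse every 14 days

for name, schedule in [("continuous low dose", continuous_low),
                       ("pulsed high dose", pulsed_high)]:
    burden = simulate(schedule)
    print(f"{name:20s} final tumor burden: {burden[-1]:7.2f}")
```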

Read the full article at: arxiv.org

Is it getting harder to make a hit? Evidence from 65 years of US music chart history

Marta Ewa Lech, Sune Lehmann, Jonas L. Juul

Since the creation of the Billboard Hot 100 music chart in 1958, the chart has been a window into the music consumption of Americans. Which songs succeed on the chart is decided by consumption volumes, which can be affected by consumer music taste and other factors such as advertisement budgets, airplay time, the specifics of ranking algorithms, and more. Since its introduction, the chart has documented music consumerism through eras of globalization, economic growth, and the emergence of new technologies for music listening. In recent years, musicians and other hitmakers have voiced their worry that the music world is changing: Many claim that it is getting harder to make a hit, but until now these claims have not been backed by chart data. Here we show that the dynamics of the Billboard Hot 100 chart have changed significantly since the chart’s founding in 1958, and in particular in the past 15 years. Whereas most songs spend less time on the chart now than songs did in the past, we show that top-1 songs have tripled their chart lifetime since the 1960s, the highest-ranked songs maintain their positions for far longer than previously, and the lowest-ranked songs are replaced more frequently than ever. At the same time, who occupies the chart has also changed: in recent years, fewer new artists make it onto the chart and more positions are occupied by established hitmakers. Finally, investigating how song chart trajectories have changed over time, we show that historical song trajectories cluster into clear trajectory archetypes characteristic of the time period they were part of. The results are interesting in the context of collective attention: Whereas recent studies have documented that other cultural products such as books, news, and movies fade in popularity more quickly in recent years, music hits seem to last longer now than in the past.
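
The trajectory-archetype analysis can be illustrated schematically: represent each song as a fixed-length vector of weekly chart ranks and cluster those vectors. The sketch below does this with synthetic trajectories, NumPy, and scikit-learn's k-means; the authors' actual features, preprocessing, and clustering method may differ.

```python
from collections import Counter

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

def synthetic_trajectory(kind, weeks=20):
    """Generate a fake Billboard-style rank trajectory (1 = best, 100 = worst)."""
    t = np.arange(weeks)
    if kind == "slow_burn":        # climbs, peaks late, then fades
        ranks = 90 - 80 * np.exp(-((t - 12) ** 2) / 40)
    elif kind == "instant_hit":    # debuts near the top, decays steadily
        ranks = 5 + 4.5 * t
    else:                          # "one_week_wonder": brief visit near the bottom
        ranks = 95 - 10 * np.exp(-((t - 3) ** 2) / 6)
    return np.clip(ranks + rng.normal(0, 3, weeks), 1, 100)

# Build a toy corpus of trajectories with known archetypes.
kinds = ["slow_burn"] * 50 + ["instant_hit"] * 50 + ["one_week_wonder"] * 50
X = np.array([synthetic_trajectory(k) for k in kinds])

# Cluster the fixed-length rank vectors into archetypes with k-means.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for cluster in range(3):
    members = [k for k, lab in zip(kinds, labels) if lab == cluster]
    dominant, count = Counter(members).most_common(1)[0]
    print(f"cluster {cluster}: {len(members):3d} songs, mostly '{dominant}' ({count})")
```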

Read the full article at: arxiv.org

Measuring Molecular Complexity

Louie Slocombe and Sara Imari Walker

ACS Cent. Sci. 2024

In a scientific era focused on big data, it is easy to lose sight of the critical role of metrology, the science of measurement, in advancing fundamental science. However, most major scientific advances have been driven by progress in what we measure and how we measure it. An example is the invention of temperature, (1) where, before it, we could say that one thing was hotter than another, but without a standardized, empirical measure we could not say how much hotter. This is not unlike the current state of discussing complexity in chemistry, (2,3) where we can say molecules are complex but lack an empirically validated standardization to confirm that one is more complex than another. In this issue of ACS Central Science, (4) a set of experiments conducted by Leroy Cronin and co-workers at the University of Glasgow aims to change this by providing a new kind of measurement with a well-defined scale, a significant step toward a metrology of complexity in chemistry. Although the concept of quantifying molecular complexity is not itself new, (3) the team leveraged principles from the recently developed theory of molecular assembly (MA) and related ideas (5) to define a rigorous concept of a scale for complexity, connected to a theory for how evolution builds complex molecules. (6,7) They show how the complexity of molecules on this scale can be inferred from standard laboratory spectroscopic techniques, including nuclear magnetic resonance (NMR), infrared (IR) spectroscopy, and tandem mass spectrometry (MS/MS). The robust validation of the inferred complexity across a multimodal suite of techniques instills confidence in the objectivity of the proposed complexity scale and the reliability of the resulting measurements.
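
The inference step described here, mapping spectroscopic observables onto an assembly-based complexity scale, can be sketched generically: calibrate a regression from a spectral feature (below, a hypothetical count of distinct MS/MS fragment peaks with invented values) to known molecular assembly indices, then predict the index of new molecules from their spectra alone. This is a schematic of calibrated inference, not the Glasgow group's actual procedure or data.

```python
import numpy as np

# Hypothetical calibration set: (distinct MS/MS peak count, known assembly index).
# All values are invented for illustration only.
peak_counts      = np.array([ 4,  7, 11, 15, 22, 28, 35, 41])
assembly_indices = np.array([ 3,  5,  7,  9, 12, 14, 17, 19])

# Fit a simple linear calibration: assembly index ~ a * peaks + b.
a, b = np.polyfit(peak_counts, assembly_indices, deg=1)
print(f"calibration: MA ≈ {a:.2f} * peaks + {b:.2f}")

# Predict the complexity of "unknown" molecules from their spectra alone.
for unknown_peaks in (9, 25, 38):
    print(f"{unknown_peaks:2d} distinct peaks -> inferred MA ≈ {a * unknown_peaks + b:.1f}")
```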

Read the full article at: pubs.acs.org