Challenges and opportunities for digital twins in precision medicine: a complex systems perspective

Manlio De Domenico, Luca Allegri, Guido Caldarelli, Valeria d’Andrea, Barbara Di Camillo, Luis M. Rocha, Jordan Rozum, Riccardo Sbarbati, Francesco Zambelli

The adoption of digital twins (DTs) in precision medicine is increasingly viable, propelled by extensive data collection and advancements in artificial intelligence (AI), alongside traditional biomedical methodologies. However, the reliance on black-box predictive models, which utilize large datasets, presents limitations that could impede the broader application of DTs in clinical settings. We argue that hypothesis-driven generative models, particularly multiscale modeling, are essential for boosting the clinical accuracy and relevance of DTs, thereby making a significant impact on healthcare innovation. This paper explores the transformative potential of DTs in healthcare, emphasizing their capability to simulate complex, interdependent biological processes across multiple scales. By integrating generative models with extensive datasets, we propose a scenario-based modeling approach that enables the exploration of diverse therapeutic strategies, thus supporting dynamic clinical decision-making. This method not only leverages advancements in data science and big data for improving disease treatment and prevention but also incorporates insights from complex systems and network science, quantitative biology, and digital medicine, promising substantial advancements in patient care.

Read the full article at: arxiv.org
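To make the scenario-based idea concrete, here is a minimal sketch: a single mechanistic patient model (a deliberately toy one-compartment pharmacokinetic/pharmacodynamic system) is simulated under several hypothetical treatment scenarios and the predicted outcomes are compared. The model form, parameter values, and dosing schedules are illustrative assumptions, not the multiscale models proposed in the paper.

```python
# A minimal sketch of scenario-based digital-twin simulation: one toy
# mechanistic patient model run under several hypothetical treatment
# scenarios. Model, parameters, and schedules are illustrative
# assumptions, not the paper's multiscale models.
from scipy.integrate import solve_ivp

def pkpd(t, y, dose_rate, k_elim, k_growth, k_kill):
    """Toy PK/PD model: drug concentration c and disease burden b."""
    c, b = y
    dc = dose_rate(t) - k_elim * c       # infusion minus elimination
    db = k_growth * b - k_kill * c * b   # growth minus drug-induced kill
    return [dc, db]

scenarios = {
    "no treatment":      lambda t: 0.0,
    "constant infusion": lambda t: 1.0,
    "pulsed dosing":     lambda t: 2.0 if (t % 24.0) < 4.0 else 0.0,
}

for name, dose in scenarios.items():
    sol = solve_ivp(pkpd, (0.0, 240.0), [0.0, 1.0],
                    args=(dose, 0.1, 0.02, 0.05),
                    max_step=0.5)  # small steps so dosing pulses are resolved
    print(f"{name:>18}: predicted final burden = {sol.y[1, -1]:.3f}")
```

In a genuine digital twin, the toy model would be replaced by validated multiscale components calibrated to the individual patient, but the decision-support loop of simulating and ranking candidate scenarios has this same shape.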

Is it getting harder to make a hit? Evidence from 65 years of US music chart history

Marta Ewa Lech, Sune Lehmann, Jonas L. Juul

Since the creation of the Billboard Hot 100 music chart in 1958, the chart has been a window into the music consumption of Americans. Which songs succeed on the chart is decided by consumption volumes, which are shaped by consumers' musical taste as well as by other factors such as advertising budgets, airplay time, the specifics of ranking algorithms, and more. Since its introduction, the chart has documented music consumption through eras of globalization, economic growth, and the emergence of new technologies for music listening. In recent years, musicians and other hitmakers have voiced their worry that the music world is changing: Many claim that it is getting harder to make a hit, but until now these claims have not been tested against chart data. Here we show that the dynamics of the Billboard Hot 100 chart have changed significantly since the chart's founding in 1958, and in particular in the past 15 years. Whereas most songs spend less time on the chart now than songs did in the past, we show that top-1 songs have tripled their chart lifetime since the 1960s, the highest-ranked songs maintain their positions for far longer than before, and the lowest-ranked songs are replaced more frequently than ever. At the same time, who occupies the chart has also changed over the years: In recent years, fewer new artists make it onto the chart and more positions are occupied by established hit makers. Finally, investigating how song chart trajectories have changed over time, we show that historical trajectories cluster into clear archetypes characteristic of the time period they belong to. The results are interesting in the context of collective attention: Whereas recent studies have documented that other cultural products such as books, news, and movies fade from popularity more quickly than they used to, music hits seem to last longer now than in the past.

Read the full article at: arxiv.org
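The trajectory-archetype result invites a simple illustration: represent each song as its weekly rank sequence, pad the sequences to a common length, and cluster them. The sketch below does exactly that on synthetic data with k-means; the generated trajectories, the "rank 101 = off the chart" padding convention, and the choice of three clusters are assumptions for illustration, not the paper's actual pipeline.

```python
# A minimal sketch of the trajectory-clustering idea: weekly chart ranks
# are padded to equal length and grouped with k-means. Synthetic data
# and k=3 are illustrative assumptions, not the paper's method.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
MAX_WEEKS = 52

def synthetic_trajectory(peak, weeks):
    """Rise toward `peak` rank, then decay back down the chart."""
    t = np.arange(weeks)
    ranks = 100 - (100 - peak) * np.exp(-((t - weeks / 3) ** 2) / weeks)
    return np.clip(ranks + rng.normal(0, 3, weeks), 1, 100)

songs = [synthetic_trajectory(rng.integers(1, 80), rng.integers(5, MAX_WEEKS))
         for _ in range(300)]

# Pad with rank 101 ("off the chart") so every trajectory has equal length.
X = np.full((len(songs), MAX_WEEKS), 101.0)
for i, s in enumerate(songs):
    X[i, :len(s)] = s

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for k in range(3):
    members = X[labels == k]
    print(f"archetype {k}: {len(members)} songs, "
          f"mean best rank {members.min(axis=1).mean():.1f}")
```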

Measuring Molecular Complexity

Louie Slocombe and Sara Imari Walker

ACS Cent. Sci. 2024

In a scientific era focused on big data, it is easy to lose sight of the critical role of metrology, the science of measurement, in advancing fundamental science. However, most major scientific advances have been driven by progress in what we measure and how we measure it. An example is the invention of temperature, (1) before which we could say that one thing was hotter than another but, lacking a standardized, empirical measure, not how much hotter. This is not unlike the current state of discussing complexity in chemistry, (2,3) where we can say molecules are complex but lack an empirically validated standard to confirm that one is more complex than another. In this issue of ACS Central Science, (4) a set of experiments by Leroy Cronin and co-workers conducted at the University of Glasgow aims to change this by providing a new kind of measurement with a well-defined scale, a significant step toward a metrology of complexity in chemistry. Although the concept of quantifying molecular complexity is not itself new, (3) the team leveraged principles from the recently developed theory of molecular assembly (MA) and related ideas (5) to define a rigorous concept of a scale for complexity, connected to a theory of how evolution builds complex molecules. (6,7) They show how the complexity of molecules on this scale can be inferred from standard laboratory spectroscopic techniques, including nuclear magnetic resonance (NMR), infrared (IR) spectroscopy, and tandem mass spectrometry (MS/MS). The robust validation of the inferred complexity across a multimodal suite of techniques instills confidence in the objectivity of the proposed complexity scale and the reliability of the resulting measurements.

Read the full article at: pubs.acs.org
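A toy analogue can make the assembly idea concrete. For character strings instead of molecules, define the assembly index as the minimum number of pairwise join operations needed to build a target string, where any fragment already built may be reused. The brute-force search below is only a conceptual sketch: the actual MA index is defined over molecular bond-forming operations and, crucially, is inferred from NMR, IR, and MS/MS data rather than computed this way.

```python
# Toy string analogue of the molecular assembly (MA) index: the minimum
# number of pairwise joins needed to build `target`, starting from its
# distinct characters and reusing any fragment already assembled.
# Conceptual sketch only; real MA operates on molecular bonds.
def assembly_index(target, max_depth=12):
    def buildable(pool, steps_left):
        if target in pool:
            return True
        if steps_left == 0:
            return False
        for a in pool:
            for b in pool:
                joined = a + b
                # only consider joins that still occur inside the target
                if joined not in pool and joined in target:
                    if buildable(pool | {joined}, steps_left - 1):
                        return True
        return False

    units = frozenset(target)  # unit building blocks: distinct characters
    for depth in range(max_depth + 1):  # iterative deepening
        if buildable(units, depth):
            return depth
    return None

# Repetitive structure is cheap to assemble; irregular strings are not.
for s in ["ABABAB", "AABBAABB", "ABCDEF"]:
    print(s, "->", assembly_index(s))  # 3, 4, 5
```

"ABABAB" needs only three joins (AB, then ABAB, then ABABAB) because fragments are reused, whereas the fully irregular "ABCDEF" needs five; this gap between reuse-rich and reuse-free construction is the intuition behind the assembly scale.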

An Informational Approach to Emergence

Claudio Gnoli

Volume 29, pages 543–551 (2024)

Emergence can be described as a relationship between entities at different levels of organization, one that looks especially puzzling at the transitions between the major levels of matter, life, cognition, and culture. Indeed, each major level depends on the one below it not just for its constituents but in some more formal way. A passage by François Jacob suggests that all such evolutionary transitions are associated with the appearance of some form of memory: genetic, neural, or linguistic, respectively. This implies that the transitions have an informational nature. Based on this idea, we propose a general model of informational systems understood as combinations of modules taken from a limited inventory. Some informational systems are "semantic" models, that is, they reproduce features of their environment. Among these, some are also "informed", that is, they have a pattern derived from a memory subsystem. The levels and components of informed systems can be listed to provide a general framework for knowledge organization, relevant to both philosophical ontology and applied information services.

Read the full article at: link.springer.com

Editorial: Understanding and engineering cyber-physical collectives

Roberto Casadei, Lukas Esterle, Rose Gamble, Paul Harvey, and Elizabeth F. Wanner

Front. Robot. AI, 06 May 2024

Cyber-physical collectives (CPCs) are systems consisting of groups of interactive computational devices situated in physical space. Their emergence is fostered by recent techno-scientific trends like the Internet of Things (IoT), cyber-physical systems (CPSs), pervasive computing, and swarm robotics. Such systems feature networks of devices that are capable of computation and communication with other devices, as well as sensing, actuation, and physical interaction with their environment. This distributed sensing, processing, and action enables them to address spatially situated problems and provide environment-wide services through their collective intelligence (CI) in a wide range of domains including smart homes, buildings, factories, cities, forests, oceans, and so on. However, the inherent complexity of such systems in terms of heterogeneity, scale, non-linear interaction, and emergent behaviour calls for scientific and engineering ideas, methods, and tools (cf. Wirsing et al. (2023); Dorigo et al. (2021); Brambilla et al. (2013); Casadei (2023a; b)). This Research Topic gathers contributions related to understanding and engineering cyber-physical collectives.

Read the full article at: www.frontiersin.org
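As a small taste of the collective-intelligence mechanisms such systems build on, the sketch below simulates gossip-based averaging: each device repeatedly averages its local sensor reading with a random neighbour, and the network as a whole converges on the global mean using only local communication. The ring topology, the readings, and the update rule are illustrative assumptions, not taken from the editorial.

```python
# A minimal sketch of one collective-intelligence primitive for
# cyber-physical collectives: pairwise gossip averaging on a ring of
# devices. Topology, readings, and schedule are illustrative assumptions.
import random

random.seed(1)
N = 20
readings = [random.uniform(15.0, 35.0) for _ in range(N)]       # e.g. local temperatures
neighbours = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}  # ring network
true_mean = sum(readings) / N

for _ in range(2000):
    i = random.randrange(N)
    j = random.choice(neighbours[i])
    # pairwise averaging conserves the sum, so the fixed point is the mean
    readings[i] = readings[j] = (readings[i] + readings[j]) / 2

worst_error = max(abs(r - true_mean) for r in readings)
print(f"global mean {true_mean:.3f}; worst device error after gossip: {worst_error:.4f}")
```

The same purely local interaction pattern, scaled up and made robust to failures and mobility, underlies many of the coordination and self-organisation mechanisms studied in the contributions to this Research Topic.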