Month: May 2017

Entropy Special Issue “Information Decomposition of Target Effects from Multi-Source Interactions”

Shannon information theory has provided rigorous ways to capture our intuitive notions regarding uncertainty and information, and has made an enormous impact in doing so. One of the fundamental measures here is mutual information, which captures the average information contained in one variable about another, and vice versa. If we have two source variables and a target, for example, we can measure the information held by one source about the target, the information held by the other source about the target, and the information held by those sources together about the target. Any other notion of the directed information relationship between these variables that can be captured by classical information-theoretic measures (e.g., conditional mutual information terms) is linearly redundant with those three quantities.

However, intuitively, there is a strong desire to measure further notions of how this directed information interaction may be decomposed, e.g., how much information the two source variables hold redundantly about the target, how much each source variable holds uniquely, and how much information can only be discerned by synergistically examining the two sources together. These notions go beyond the traditional information-theoretic view of a channel serving the purpose of reliable communication, considering now the situation of multiple communication streams converging on a single target. This is a common situation in biology, and in particular in neuroscience, where, say, the ability of a target to synergistically fuse multiple information sources in a non-trivial fashion is likely to have its own intrinsic value, independently of the reliability of communication.
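To see concretely why the three classical quantities cannot separate these notions, consider the canonical XOR example (an illustrative sketch, not drawn from the Special Issue itself): each source alone carries zero bits about the target, yet the two sources together carry one full bit, which any decomposition must attribute entirely to synergy.

```python
from collections import defaultdict
from math import log2

def mi(p_xy):
    """Mutual information I(X;Y), in bits, from a joint pmf {(x, y): p}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in p_xy.items():
        px[x] += p
        py[y] += p
    return sum(p * log2(p / (px[x] * py[y])) for (x, y), p in p_xy.items() if p > 0)

def marginal(joint, keep):
    """Marginalise a pmf over tuples down to the coordinate indices in `keep`."""
    out = defaultdict(float)
    for key, p in joint.items():
        out[tuple(key[i] for i in keep)] += p
    return dict(out)

# Joint distribution p(x1, x2, y) for y = x1 XOR x2 with uniform binary inputs.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

i1 = mi(marginal(joint, (0, 2)))   # I(X1;Y) = 0 bits: each source alone is useless
i2 = mi(marginal(joint, (1, 2)))   # I(X2;Y) = 0 bits
i12 = mi({((x1, x2), y): p for (x1, x2, y), p in joint.items()})  # I(X1,X2;Y) = 1 bit
```

The three measured quantities (0, 0, and 1 bit) are all that classical information theory provides; assigning the 1 bit to synergy rather than redundancy or unique information is exactly the job of a partial information decomposition.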

The absence of measures for such decompositions into redundant, unique and synergistic information is arguably the most fundamental missing piece in classical information theory. Triggered by the formulation of the Partial Information Decomposition framework by Williams and Beer in 2010, the past few years have witnessed a concentration of work by the community in proposing, contrasting, and investigating new measures to capture these notions of information decomposition. Other theoretical developments consider how these measures relate to concepts of information processing in terms of storage, transfer and modification. Meanwhile, computational neuroscience has emerged as a primary application area, due to significant interest in questions surrounding how target neurons integrate information from large numbers of sources, as well as the availability of data sets with which to investigate these questions.

This Special Issue seeks to bring together these efforts, to capture a snapshot of the current research, as well as to provide impetus for and focused scrutiny on newer work. We also seek to present progress to the wider community and attract further research. We welcome research articles proposing new measures or pointing out future directions, review articles on existing approaches, commentary on properties and limitations of such approaches, philosophical contributions on how such measures may be used or interpreted, applications to empirical data (e.g., neural imaging data), and more.

Dr. Joseph Lizier
Dr. Nils Bertschinger
Prof. Michael Wibral
Prof. Juergen Jost
Guest Editors

Source: www.mdpi.com

The missing links: A global study on uncovering financial network structures from partial data

Capturing financial network linkages and contagion in stress-test models is an important goal for banking supervisors and central banks responsible for micro- and macroprudential policy. However, granular data on financial networks is often lacking, and instead the networks must be reconstructed from partial data. In this paper, we conduct a horse race of network reconstruction methods using network data obtained from 25 different markets spanning 13 jurisdictions. Our contribution is twofold: first, we collate and analyze data on a wide range of financial networks; second, we rank the methods in terms of their ability to reconstruct the structures of links and exposures in networks.
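One standard baseline in this reconstruction literature is a maximum-entropy-style approach in which an exposure matrix is filled in to match observed aggregates via iterative proportional fitting (the RAS algorithm). The sketch below is illustrative only, assuming a toy three-bank system with made-up totals; it is not the paper's own method or data.

```python
# RAS / iterative proportional fitting: recover an interbank exposure matrix
# X[i][j] (bank i lends to bank j) whose row sums match each bank's total
# interbank assets and whose column sums match its total liabilities.
# All figures below are hypothetical.
row_totals = [30.0, 20.0, 10.0]   # total interbank assets per bank
col_totals = [25.0, 25.0, 10.0]   # total interbank liabilities per bank
n = len(row_totals)

# Start from a uniform prior with no self-exposures (zero diagonal).
X = [[0.0 if i == j else 1.0 for j in range(n)] for i in range(n)]

for _ in range(200):
    for i in range(n):                       # scale each row to its asset total
        s = sum(X[i])
        if s > 0:
            X[i] = [x * row_totals[i] / s for x in X[i]]
    for j in range(n):                       # scale each column to its liability total
        s = sum(X[i][j] for i in range(n))
        if s > 0:
            for i in range(n):
                X[i][j] *= col_totals[j] / s
```

A known limitation, and one reason horse races like this paper's are needed, is that this maximum-entropy fill-in tends to produce fully dense networks, whereas real interbank networks are typically sparse.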

 

The missing links: A global study on uncovering financial network structures from partial data

Kartik Anand, et al.

Journal of Financial Stability

Source: www.sciencedirect.com

Statistical physics of human cooperation

Extensive cooperation among unrelated individuals is unique to humans, who often sacrifice personal benefits for the common good and work together to achieve what they are unable to execute alone. The evolutionary success of our species is indeed due, to a large degree, to our unparalleled other-regarding abilities. Yet, a comprehensive understanding of human cooperation remains a formidable challenge. Recent research in social science indicates that it is important to focus on the collective behavior that emerges as the result of the interactions among individuals, groups, and even societies. Non-equilibrium statistical physics, in particular Monte Carlo methods and the theory of collective behavior of interacting particles near phase transition points, has proven to be very valuable for understanding counterintuitive evolutionary outcomes. By studying models of human cooperation as classical spin models, a physicist can draw on familiar settings from statistical physics. However, unlike the pairwise interactions among particles that typically govern solid-state physics systems, human interactions often occur in groups, and they involve a larger number of possible states even for the most simplified description of reality. The complexity of solutions therefore often surpasses that observed in physical systems. Here we review experimental and theoretical research that advances our understanding of human cooperation, focusing on spatial pattern formation, on the spatiotemporal dynamics of observed solutions, and on self-organization that may either promote or hinder socially favorable states.
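The spin-model analogy can be made concrete with a minimal Monte Carlo sketch: a spatial prisoner's dilemma on a lattice with strategy imitation governed by the Fermi rule, where the noise parameter plays the role of temperature. This is an illustrative toy in the spirit of the models the review discusses; the parameter values (b, K, lattice size, step count) are arbitrary choices, not taken from the paper.

```python
import random
from math import exp

L = 20          # lattice side length (periodic boundaries)
b = 1.05        # temptation to defect (weak prisoner's dilemma: T=b, R=1, P=S=0)
K = 0.1         # noise ("temperature") in the Fermi imitation rule
random.seed(1)

# Random initial strategies: 1 = cooperate, 0 = defect.
state = [[random.randint(0, 1) for _ in range(L)] for _ in range(L)]

def neighbors(i, j):
    return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

def payoff(i, j):
    """Accumulated payoff from pairwise games with the four nearest neighbors."""
    s = state[i][j]
    total = 0.0
    for ni, nj in neighbors(i, j):
        if state[ni][nj] == 1:               # against a cooperating neighbor:
            total += 1.0 if s == 1 else b    # cooperator earns R=1, defector earns T=b
    return total                             # against a defector both earn 0 (P=S=0)

def mc_step():
    """One full Monte Carlo step: L*L random strategy-update attempts."""
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        ni, nj = random.choice(neighbors(i, j))
        if state[i][j] == state[ni][nj]:
            continue
        # Fermi rule: adopt the neighbor's strategy with a probability that
        # increases smoothly with the payoff difference, tempered by noise K.
        p = 1.0 / (1.0 + exp((payoff(i, j) - payoff(ni, nj)) / K))
        if random.random() < p:
            state[i][j] = state[ni][nj]

for _ in range(50):
    mc_step()
rho_c = sum(map(sum, state)) / (L * L)   # surviving fraction of cooperators
```

Sweeping b at fixed K and tracking rho_c reproduces, in miniature, the kind of phase-transition analysis the review describes: cooperators survive via spatial clustering below a critical temptation and die out above it.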

 

Statistical physics of human cooperation
Matjaz Perc, Jillian J. Jordan, David G. Rand, Zhen Wang, Stefano Boccaletti, Attila Szolnoki

Source: arxiv.org

Origins of Life: A Problem for Physics

The origin of life stands among the great open scientific questions of our time. While a number of proposals exist for possible starting points in the pathway from non-living to living matter, these have so far not achieved states of complexity anywhere near that of even the simplest living systems. A key challenge is identifying the properties of living matter that might distinguish living from non-living physical systems, such that we might build new life in the lab. This review covers the major viewpoints on the origin of life for those new to the field, with a forward look towards what it might take for a physical theory that universally explains the phenomenon of life to arise from the seemingly disconnected array of ideas proposed thus far. The hope is that a theory akin to our other theories in fundamental physics might one day emerge to explain the phenomenon of life, and in turn finally permit solving its origins.

 

Origins of Life: A Problem for Physics
Sara I. Walker

Source: arxiv.org

Predicting stock market movements using network science: An information theoretic approach

A stock market is considered one of the most complex systems, consisting of many components whose prices move up and down without a clear pattern. This complexity makes reliable prediction of its future movements challenging. In this paper, we aim to build a new method to forecast future movements of the Standard & Poor’s 500 Index (S&P 500) by constructing time-series complex networks of the S&P 500’s underlying companies, connecting them with links whose weights are given by the mutual information of the 60-minute price movements of each pair of companies over 5,340 consecutive minutes of price records. We show that changes in the strength distributions of the networks provide important information about the market’s future movements. We build several metrics using the strength distributions and network measurements such as centrality, and combine the best two predictors via a linear combination. We find that the combined predictor and the changes in the S&P 500 show a quadratic relationship, which allows us to predict the amplitude of the one-step future change in the S&P 500: the index fluctuates significantly when the combined predictor is high. To make actual index predictions, we build ARIMA models, and we find that adding the network measurements to the ARIMA models improves their accuracy. These findings are useful for financial market policy makers, as an indicator based on which they can intervene in the markets before a drastic change, and for quantitative investors seeking to improve their forecasting models.
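The core construction (a weighted network whose edges carry the mutual information between price-movement series, summarized by node strengths) can be sketched in a few lines. This is a hedged toy illustration on synthetic up/down series, not the paper's pipeline: the ticker names and correlation structure are invented, and real returns would need a binning scheme before the discrete MI estimate applies.

```python
import random
from math import log2

def mi_binary(a, b):
    """Plug-in mutual information, in bits, between two binary sequences."""
    n = len(a)
    pj = {}
    for x, y in zip(a, b):
        pj[(x, y)] = pj.get((x, y), 0) + 1 / n
    pa = {x: sum(p for (i, _), p in pj.items() if i == x) for x in (0, 1)}
    pb = {y: sum(p for (_, j), p in pj.items() if j == y) for y in (0, 1)}
    return sum(p * log2(p / (pa[x] * pb[y])) for (x, y), p in pj.items() if p > 0)

# Hypothetical price-move series: 1 = price rose over a 60-minute window.
random.seed(0)
base = [random.randint(0, 1) for _ in range(500)]
moves = {
    "AAA": base,
    "BBB": [b_ if random.random() < 0.9 else 1 - b_ for b_ in base],  # correlated with AAA
    "CCC": [random.randint(0, 1) for _ in range(500)],                # independent
}

# Weighted network: edge weight = pairwise MI; node strength = sum of incident weights.
tickers = list(moves)
strength = {t: 0.0 for t in tickers}
for i, t1 in enumerate(tickers):
    for t2 in tickers[i + 1:]:
        w = mi_binary(moves[t1], moves[t2])
        strength[t1] += w
        strength[t2] += w
```

In the paper's approach, it is the evolution of this strength distribution over successive time windows, rather than a single snapshot, that serves as the predictor.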

 

Predicting stock market movements using network science: An information theoretic approach

Minjun Kim, Hiroki Sayama

Source: arxiv.org