Category: Announcements

YRNCS Job Fair at CCS’17

Have you got an open position in your group that you’d like to advertise? Are you a young researcher looking for career opportunities?

The YRNCS Job Fair will give PhD students and early-career researchers a great opportunity to find out about open positions during CCS 2017. It will take place during the Welcome Cocktail reception on Monday 18th September from 7pm onwards, and flyers and posters advertising the positions will be on display all week. The Job Fair offers a chance to meet potential employers and employees, or simply to mingle and make new contacts!

If you’d like to advertise a position, send a one-page flyer to f.botta@warwick.ac.uk.

Source: yrncs.cssociety.org

ITMO University Professorship Program

This program is aimed at strengthening the internationalization of the educational experience for scholars, students and the University.

The professorship extends to a variety of fields and is open to highly qualified professionals who hold a doctoral degree and are affiliated with the world’s top universities.

From teaching the curriculum of double-degree programs to presenting short-term courses, there is a variety of opportunities to contribute topical expertise and cutting-edge teaching methods.

International professors can expect a student-oriented learning environment with an emphasis on real-world, global experience. They will also enjoy the personal attention of ITMO University’s Foreign Students and Scholars Office, which will not only help professors and their families relocate smoothly, but also help them make the most of their time in St. Petersburg.

Source: fellowship.ifmo.ru

ITMO University Fellowship Program

The ITMO University Fellowship Program aims to provide outstanding researchers and scientists, who are, or have the potential to become, leaders in their chosen fields, with the opportunity to build an independent research career. Our intention is to help develop the next generation of researchers with the greatest potential in their postdoctoral and early-career stages.

Source: fellowship.ifmo.ru

Entropy Special Issue “Information Decomposition of Target Effects from Multi-Source Interactions”

Shannon information theory has provided rigorous ways to capture our intuitive notions regarding uncertainty and information, and has made an enormous impact in doing so. One of its fundamental measures is mutual information, which captures the average information contained in one variable about another, and vice versa. If we have two source variables and a target, for example, we can measure the information held by one source about the target, the information held by the other source about the target, and the information held by both sources together about the target. Any other notion about the directed information relationship between these variables that can be captured by classical information-theoretic measures (e.g., conditional mutual information terms) is linearly redundant with those three quantities.
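For concreteness (with notation introduced here purely for illustration, not part of the original call), write the two sources as X_1 and X_2 and the target as T. The chain rule for mutual information shows why conditional terms add nothing beyond the three quantities above:

% Chain rule for mutual information (standard identity):
\begin{align*}
  I(X_1, X_2 ; T) &= I(X_1 ; T) + I(X_2 ; T \mid X_1) \\
                  &= I(X_2 ; T) + I(X_1 ; T \mid X_2)
\end{align*}
% e.g., I(X_2 ; T | X_1) = I(X_1, X_2 ; T) - I(X_1 ; T),
% a linear combination of the three quantities named in the text.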

However, intuitively, there is a strong desire to measure further notions of how this directed information interaction may be decomposed: e.g., how much information the two source variables hold redundantly about the target, how much each source variable holds uniquely, and how much information can only be discerned by synergistically examining the two sources together. These notions go beyond the traditional information-theoretic view of a channel serving the purpose of reliable communication, considering instead the situation of multiple communication streams converging on a single target. This is a common situation in biology, and in particular in neuroscience, where, say, the ability of a target to synergistically fuse multiple information sources in a non-trivial fashion is likely to have intrinsic value of its own, independently of the reliability of communication.
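As a sketch of what such a decomposition looks like in the two-source case considered by Williams and Beer (mentioned below), with symbols R (redundant), U_1, U_2 (unique) and S (synergistic) introduced here only for illustration:

% Consistency equations of a two-source partial information decomposition:
\begin{align*}
  I(X_1, X_2 ; T) &= R + U_1 + U_2 + S \\
  I(X_1 ; T)      &= R + U_1 \\
  I(X_2 ; T)      &= R + U_2
\end{align*}

The three classical quantities on the left constrain, but do not determine, the four components on the right, which is why an additional definition of redundancy, uniqueness, or synergy is required.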

The absence of measures for such decompositions into redundant, unique and synergistic information is arguably the most fundamental missing piece in classical information theory. Triggered by the formulation of the Partial Information Decomposition framework by Williams and Beer in 2010, the past few years have witnessed concentrated work by the community on proposing, contrasting, and investigating new measures to capture these notions of information decomposition. Other theoretical developments consider how these measures relate to concepts of information processing in terms of storage, transfer and modification. Meanwhile, computational neuroscience has emerged as a primary application area, due both to significant interest in how target neurons integrate information from large numbers of sources and to the availability of data sets with which to investigate these questions.

This Special Issue seeks to bring together these efforts, to capture a snapshot of the current research, as well as to provide impetus for and focused scrutiny on newer work. We also seek to present progress to the wider community and attract further research. We welcome research articles proposing new measures or pointing out future directions, review articles on existing approaches, commentary on properties and limitations of such approaches, philosophical contributions on how such measures may be used or interpreted, applications to empirical data (e.g., neural imaging data), and more.

Dr. Joseph Lizier
Dr. Nils Bertschinger
Prof. Michael Wibral
Prof. Juergen Jost
Guest Editors

Source: www.mdpi.com