Month: March 2019

Postdoctoral Fellowships at the Centro de Ciencias de la Complejidad (C3), UNAM

The C3-UNAM announces that applications for two postdoctoral grants from UNAM to conduct research at the C3-UNAM will be received in two periods each year, April-May and December-January, with starting dates in September and March, respectively (four postdoctoral grants per year). The purpose of the grants is to conduct research in complexity science in one of the following areas: computational intelligence and mathematical modeling, complexity and health, neurosciences, or ecological complexity and environment (postdoctoral grants for research in humanistic areas such as social complexity and arts, science and complexity will be announced separately). The academic programs developed at the C3-UNAM are described at:
https://www.c3.unam.mx/progacademicos.html
Technical details of the application process are explained at:
http://dgapa.unam.mx/images/posdoc/2019_posdoc_convocatoria.pdf
The grants are for one year and are renewable for a second year depending on the results obtained.

Source: www.c3.unam.mx

An Exact No Free Lunch Theorem for Community Detection

A precondition for a No Free Lunch theorem is evaluation with a loss function which does not assume a priori superiority of some outputs over others. A previous result for community detection by Peel et al. (2017) relies on a mismatch between the loss function and the problem domain. The loss function computes an expectation over only a subset of the universe of possible outputs; thus, it is only asymptotically appropriate with respect to the problem size. By using the correct random model for the problem domain, we provide a stronger, exact No Free Lunch theorem for community detection. The claim generalizes to other set-partitioning tasks including core/periphery separation, k-clustering, and graph partitioning. Finally, we review the literature of proposed evaluation functions and identify functions which (perhaps with slight modifications) are compatible with an exact No Free Lunch theorem.
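
As a rough, self-contained illustration of the abstract's point about the "universe of possible outputs" (a sketch, not code from the paper): for n items, the space of all clusterings is the set of all set partitions of those items, counted by the Bell number, while restricting attention to clusterings with a fixed number of blocks k covers only the partitions counted by the Stirling number S(n, k), a strict subset for 1 <= k < n. A loss whose expectation runs over only that subset is therefore evaluated against a smaller output space than the full partition domain.

```python
# A minimal sketch (not from the paper): enumerate all set partitions of a
# small universe and compare their count (Bell number) with the count of
# partitions into exactly k blocks (Stirling number of the second kind).

def set_partitions(items):
    """Yield every partition of `items` as a list of blocks."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for partition in set_partitions(rest):
        # put `first` in a block of its own ...
        yield [[first]] + partition
        # ... or insert `first` into each existing block in turn
        for i in range(len(partition)):
            yield partition[:i] + [[first] + partition[i]] + partition[i + 1:]

if __name__ == "__main__":
    n, k = 5, 2
    all_parts = list(set_partitions(list(range(n))))
    k_parts = [p for p in all_parts if len(p) == k]
    print(f"all partitions of {n} items (Bell number): {len(all_parts)}")       # 52
    print(f"partitions into exactly {k} blocks (Stirling number): {len(k_parts)}")  # 15
```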

 

An Exact No Free Lunch Theorem for Community Detection
Arya D. McCarthy, Tongfei Chen, Seth Ebner

Source: arxiv.org

How to Make Swarms Open-Ended? Evolving Collective Intelligence Through a Constricted Exploration of Adjacent Possibles

We propose an approach to open-ended evolution via the simulation of swarm dynamics. In nature, swarms possess remarkable properties that allow many organisms, from swarming bacteria to ants and flocking birds, to form higher-order structures that enhance their behavior as a group. Swarm simulations highlight three important factors for creating novelty and diversity: (a) communication generates combinatorial cooperative dynamics, (b) concurrency allows for separation of timescales, and (c) increases in complexity and size push the system towards transitions in innovation. We illustrate these three components in a model computing the continuous evolution of a swarm of agents. The results, divided into three distinct applications, show how emergent structures are capable of filtering information through the bottleneck of their memory to produce meaningful novelty and diversity within their simulated environment.
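
To make factor (a) concrete, here is a generic Vicsek-style alignment sketch (standard flocking dynamics, not the authors' model): each agent repeatedly adopts the average heading of the agents within its communication radius, plus noise, and this purely local exchange is enough to drive the group toward a globally ordered flock.

```python
# A minimal Vicsek-style flocking sketch (illustrative only): local
# "communication" -- averaging neighbours' headings -- produces global order.
import numpy as np

rng = np.random.default_rng(1)
n, L, r, v, eta, steps = 100, 10.0, 1.0, 0.3, 0.3, 200
pos = rng.uniform(0, L, (n, 2))               # positions in a periodic box
theta = rng.uniform(-np.pi, np.pi, n)         # headings

for _ in range(steps):
    dx = pos[:, None, :] - pos[None, :, :]
    dx -= L * np.round(dx / L)                # minimum-image (periodic) distances
    neighbors = (dx ** 2).sum(-1) < r ** 2    # who can "hear" whom (includes self)
    sin_sum = (neighbors * np.sin(theta)).sum(axis=1)
    cos_sum = (neighbors * np.cos(theta)).sum(axis=1)
    noise = eta * rng.uniform(-np.pi, np.pi, n)
    theta = np.arctan2(sin_sum, cos_sum) + noise
    pos = (pos + v * np.column_stack((np.cos(theta), np.sin(theta)))) % L

# Order parameter: 1 means all agents aligned, 0 means fully disordered headings.
order = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
print(f"alignment order parameter after {steps} steps: {order:.2f}")
```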

 

How to Make Swarms Open-Ended? Evolving Collective Intelligence Through a Constricted Exploration of Adjacent Possibles
Olaf Witkowski, Takashi Ikegami

Source: arxiv.org

A high-bias, low-variance introduction to Machine Learning for physicists

Machine Learning (ML) is one of the most exciting and dynamic areas of modern research and application. The purpose of this review is to provide an introduction to the core concepts and tools of machine learning in a manner easily understood and intuitive to physicists. The review begins by covering fundamental concepts in ML and modern statistics such as the bias–variance tradeoff, overfitting, regularization, generalization, and gradient descent before moving on to more advanced topics in both supervised and unsupervised learning. Topics covered in the review include ensemble models, deep learning and neural networks, clustering and data visualization, energy-based models (including MaxEnt models and Restricted Boltzmann Machines), and variational methods. Throughout, we emphasize the many natural connections between ML and statistical physics. A notable aspect of the review is the use of Python Jupyter notebooks to introduce modern ML/statistical packages to readers using physics-inspired datasets (the Ising Model and Monte-Carlo simulations of supersymmetric decays of proton–proton collisions). We conclude with an extended outlook discussing possible uses of machine learning for furthering our understanding of the physical world as well as open problems in ML where physicists may be able to contribute.
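
As a minimal sketch of the bias-variance tradeoff the review opens with (generic NumPy, not taken from the review's Jupyter notebooks): a low-degree polynomial underfits a noisy sine curve (high bias), while a very high-degree fit tracks the noise in the training points and generalizes worse (high variance); comparing training and test error makes the tradeoff visible.

```python
# Illustrative only: fit polynomials of increasing degree to noisy samples of
# a sine curve and compare training vs. test mean-squared error.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 20)
x_test = np.linspace(0.0, 1.0, 200)
y_train = np.sin(2 * np.pi * x_train) + 0.2 * rng.standard_normal(x_train.size)
y_test = np.sin(2 * np.pi * x_test)              # noiseless ground truth

for degree in (1, 4, 12):
    model = Polynomial.fit(x_train, y_train, degree)   # least-squares fit
    train_mse = np.mean((model(x_train) - y_train) ** 2)
    test_mse = np.mean((model(x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE = {train_mse:.3f}, test MSE = {test_mse:.3f}")
```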

 

A high-bias, low-variance introduction to Machine Learning for physicists
Pankaj Mehta et al.

Physics Reports

Source: www.sciencedirect.com