Month: October 2018

Philosophies | Special Issue: Philosophy and Epistemology of Deep Learning

Call For Papers:
Deadline for manuscript submissions: 15 March 2019
Guest Editors:
Dr. Hector Zenil
Prof. Dr. Selmer Bringsjord
Current popular approaches to Machine Learning (ML), Deep Learning (DL) and Artificial Intelligence (AI) are mostly statistical in nature, and are not well equipped to deal with abstraction and explanation. In particular, they cannot generate candidate models or make generalizations directly from data to discover possible causal mechanisms. One method researchers are resorting to in order to discover how deep learning algorithms work involves what are called ‘generative models’ (a possible misnomer): they train a learning algorithm and handicap it systematically whilst asking it to generate examples. By observing the resulting examples, they are able to make inferences about what may be happening in the algorithm at some level.
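
As a rough illustration of the "handicap and generate" probing strategy described above (not the methodology of any particular study), the following Python sketch trains a tiny linear autoencoder on synthetic data, silences one hidden unit at a time, and compares the reconstructions it then produces. All architecture choices, parameter values and variable names are illustrative assumptions.

# Illustrative sketch: probe a tiny linear autoencoder by ablating
# ("handicapping") hidden units and inspecting what it then generates.
# All architecture choices and parameters here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 2 latent factors embedded in 8 observed dimensions.
latent = rng.normal(size=(500, 2))
mixing = rng.normal(size=(2, 8))
X = latent @ mixing + 0.05 * rng.normal(size=(500, 8))

# Tiny linear autoencoder (8 -> 3 -> 8) trained by plain gradient descent.
W_enc = rng.normal(scale=0.1, size=(8, 3))
W_dec = rng.normal(scale=0.1, size=(3, 8))
lr = 0.01
for _ in range(2000):
    H = X @ W_enc                      # hidden code
    err = H @ W_dec - X                # reconstruction error
    grad_dec = H.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

def masked_error(mask):
    """Mean squared reconstruction error with the hidden code scaled by `mask`."""
    H = (X @ W_enc) * mask
    return float(np.mean((H @ W_dec - X) ** 2))

baseline = masked_error(np.ones(3))
for unit in range(3):
    mask = np.ones(3)
    mask[unit] = 0.0                   # "handicap": silence one hidden unit
    print(f"unit {unit} ablated: error {masked_error(mask):.4f} "
          f"(baseline {baseline:.4f})")

Comparing the error (or the generated examples) with and without each unit gives an indirect, behavioural picture of what the unit contributes, which is the spirit of the probing approach mentioned in the call.
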
However, current trends and methods are widely considered black-box approaches that have worked amazingly well in classification tasks, but provide little to no understanding of causation and are unable to deal with forms of symbolic computation such as logical inference and explanation. As a consequence, they also fail to scale to domains they have not been trained for, they require vast amounts of training data before they can do anything interesting, and they must be retrained every time they are presented with (even slightly) different data.
Furthermore, how other cognitive features, such as human consciousness, may be related to current and future directions in deep learning, and whether such features may prove advantageous or disadvantageous remains an open question.
The aim of this special issue is thus to attempt to ask the right questions and shed some light on the achievements, limitations and future directions of reinforcement/deep learning approaches and differentiable programming. Its particular focus will be on the interplay of data- and model-driven approaches that go beyond current ones, which are for the most part based on traditional statistics. It will attempt to ascertain whether a fundamental theory is needed or whether one already exists, and to explore the implications of current and future technologies based on deep learning and differentiable programming for science, technology and society.

Special issue website:

https://www.mdpi.com/journal/philosophies/special_issues/deep_learning

Source: www.mdpi.com

NECSI Executive Courses: Integrating Artificial and Human Intelligence & Reshaping Strategy in the Age of Data

Business and society are transforming and becoming increasingly complex. Artificial Intelligence, machine learning, big data analytics and hybrid human-machine systems are playing an increasing role in business products, strategy, and in the organization itself. 

NECSI is hosting two courses as part of its week-long NECSI Executive 2018 Fall Program in Boston, MA. Each course can stand alone, but together they form a potent and practical training experience for the executive leader. 

Source: necsi-exec.org

The Prize in Economic Sciences 2018

At its heart, economics deals with the management of scarce resources. Nature dictates the main constraints on economic growth and our knowledge determines how well we deal with these constraints. This year’s Laureates William Nordhaus and Paul Romer have significantly broadened the scope of economic analysis by constructing models that explain how the market economy interacts with nature and knowledge.

Technological change – Romer demonstrates how knowledge can function as a driver of long-term economic growth. When annual economic growth of a few per cent accumulates over decades, it transforms people’s lives. Previous macroeconomic research had emphasised technological innovation as the primary driver of economic growth, but had not modelled how economic decisions and market conditions determine the creation of new technologies. Paul Romer solved this problem by demonstrating how economic forces govern the willingness of firms to produce new ideas and innovations.

Romer’s solution, which was published in 1990, laid the foundation of what is now called endogenous growth theory. The theory is both conceptual and practical, as it explains how ideas are different to other goods and require specific conditions to thrive in a market. Romer’s theory has generated vast amounts of new research into the regulations and policies that encourage new ideas and long-term prosperity.
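
For readers who want a feel for the formalism behind "endogenous growth", a standard textbook-style sketch of Romer's (1990) setup can be written as follows; the notation follows common expositions, not this press release:

\[
Y \;=\; H_Y^{\alpha} L^{\beta} \int_0^{A} x(i)^{1-\alpha-\beta}\, di,
\qquad
\dot{A} \;=\; \delta H_A A,
\]

where $A$ is the stock of ideas (designs), $H_A$ is the human capital devoted to research, and $\delta$ is research productivity. Because the existing stock $A$ raises the productivity of future research without being used up, ideas are non-rival, and long-run growth depends on how much human capital the market allocates to research, precisely the margin that Romer showed is governed by economic incentives.
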

Climate change – Nordhaus’ findings deal with interactions between society and nature. Nordhaus decided to work on this topic in the 1970s, as scientists had become increasingly worried about the combustion of fossil fuel resulting in a warmer climate. In the mid-1990s, he became the first person to create an integrated assessment model, i.e. a quantitative model that describes the global interplay between the economy and the climate. His model integrates theories and empirical results from physics, chemistry and economics. Nordhaus’ model is now in widespread use and is applied to simulate how the economy and the climate co-evolve. It is used to examine the consequences of climate policy interventions, for example carbon taxes.
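
A deliberately simplified sketch of the economy-climate coupling in a DICE-style integrated assessment model looks like the following; the functional forms are textbook simplifications used here for illustration, not the published DICE equations:

\[
Y_t \;=\; \Omega(T_t)\, A_t K_t^{\gamma} L_t^{1-\gamma},
\qquad
\Omega(T_t) \;=\; \frac{1}{1 + a T_t^{2}},
\qquad
E_t \;=\; \sigma_t\,(1-\mu_t)\, A_t K_t^{\gamma} L_t^{1-\gamma},
\]

where gross output is scaled down by the damage factor $\Omega(T_t)$ as global temperature $T_t$ rises, emissions $E_t$ depend on the carbon intensity $\sigma_t$ and the abatement rate $\mu_t$, and a carbon-cycle and temperature module (not shown) maps accumulated emissions back into next period's temperature. A carbon tax enters such a model by changing the abatement rate $\mu_t$ that is privately optimal for firms.
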

The contributions of Paul Romer and William Nordhaus are methodological, providing us with fundamental insights into the causes and consequences of technological innovation and climate change. This year’s Laureates do not deliver conclusive answers, but their findings have brought us considerably closer to answering the question of how we can achieve sustained and sustainable global economic growth.

Source: www.nobelprize.org

Social style and resilience of macaques’ networks, a theoretical investigation

Group-living animals rely on efficient transmission of information for optimal exploitation of their habitat. How efficient and resilient a network is depends on its structure, which is a consequence of the social interactions of the individuals that comprise the network. In macaques, network structure differs according to dominance style. Networks of intolerant species are more modular, more centralized, and less connected than those of tolerant ones. Given these structural differences, networks of intolerant species are potentially more vulnerable to fragmentation and decreased information transmission when central individuals disappear. Here we studied network resilience and efficiency in artificial societies of macaques. The networks were produced with an individual-based model that has been shown to reproduce the structural features of networks of tolerant and intolerant macaques. To study network resilience, we deleted either central individuals or individuals at random and studied the effects of these deletions on network cohesiveness and efficiency. The deletion of central individuals had more negative effects than random deletions from the networks of both tolerant and intolerant artificial societies. Central individuals thus appeared to aid in the maintenance of network cohesiveness and efficiency. Further, the networks of both intolerant and tolerant societies appeared to be robust to the loss of individuals, as network fragmentation was never observed. Our results suggest that despite differences in network structure, networks of tolerant and intolerant macaques may be equally resilient.
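
As an illustration of the deletion experiment described in the abstract (not the authors' individual-based model or code), the following Python/networkx sketch removes either the most central individuals or random individuals from a toy social network and reports fragmentation and global efficiency; the network generator and all parameters are stand-ins:

# Illustrative sketch of a targeted-vs-random node deletion experiment
# on a toy social network (not the authors' individual-based model).
import random
import networkx as nx

def deletion_experiment(G, fraction=0.2, targeted=True, seed=0):
    """Remove a fraction of nodes and report fragmentation and efficiency."""
    G = G.copy()
    rng = random.Random(seed)
    n_remove = int(fraction * G.number_of_nodes())
    if targeted:
        # Remove the most central individuals first (degree centrality here;
        # betweenness or eigenvector centrality could be substituted).
        centrality = nx.degree_centrality(G)
        victims = sorted(centrality, key=centrality.get, reverse=True)[:n_remove]
    else:
        victims = rng.sample(list(G.nodes), n_remove)
    G.remove_nodes_from(victims)
    return {
        "components": nx.number_connected_components(G),   # cohesiveness
        "global_efficiency": nx.global_efficiency(G),       # information-flow proxy
    }

# Toy "society": a small-world-ish graph standing in for a macaque group.
society = nx.watts_strogatz_graph(n=50, k=6, p=0.2, seed=1)
print("targeted:", deletion_experiment(society, targeted=True))
print("random:  ", deletion_experiment(society, targeted=False))

Comparing the two conditions across many generated networks is the basic logic of the resilience analysis described above.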

Social style and resilience of macaques’ networks, a theoretical investigation
Ivan Puga-Gonzalez, Sebastian Sosa, Cedric Sueur

Primates

Source: link.springer.com

Cross-boundary Behavioural Reprogrammability Reveals Evidence of Pervasive Turing-Universality

A new paper sheds light on the pervasiveness of Turing universality by showing a series of behavioural boundary-crossing results, including emulations (for all initial conditions) of Wolfram Class 2 Elementary Cellular Automata (ECA) by Class 1 ECA, emulations of Classes 1, 2 and 3 ECA by Class 2 and 3 ECA, and of Classes 1, 2 and 3 by Class 3 ECA, along with results of even greater emulability for general CA (neighbourhood r = 3/2), including Class 1 CA emulating Classes 2 and 3, and Classes 3 and 4 emulating all other classes (1, 2, 3 and 4). The emulations occur with only a linear overhead and can be considered computationally efficient. The paper also introduces the concept of emulation networks, deriving a topologically based measure of complexity from out- and in-degree connectivity and establishing bridges to fundamental ideas of complexity, universality, causality and dynamical systems.
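
For readers unfamiliar with the objects involved, the sketch below simulates an Elementary Cellular Automaton by its Wolfram rule number. It only illustrates what an ECA is; the paper's emulation results rely on block encodings of one rule's behaviour inside another, which are not shown here, and the rule choices below are merely examples.

# Minimal Elementary Cellular Automaton (ECA) simulator by Wolfram rule number.
# This only illustrates the objects studied; the paper's cross-class emulations
# use block encodings of one rule inside another, not reproduced here.
import numpy as np

def eca_step(state, rule):
    """Apply one step of ECA `rule` (0-255) to a 1-D binary state (periodic)."""
    left = np.roll(state, 1)
    right = np.roll(state, -1)
    neighbourhood = 4 * left + 2 * state + right      # value 0..7 per cell
    rule_table = (rule >> np.arange(8)) & 1           # lookup table from rule number
    return rule_table[neighbourhood]

def run_eca(rule, width=64, steps=32, seed=0):
    rng = np.random.default_rng(seed)
    state = rng.integers(0, 2, size=width)
    history = [state]
    for _ in range(steps):
        state = eca_step(state, rule)
        history.append(state)
    return np.array(history)

# Rule 110 (Class 4, known to be Turing-universal) vs rule 32 (Class 1).
for rule in (110, 32):
    history = run_eca(rule)
    print(f"rule {rule}: fraction of live cells over time "
          f"{history.mean(axis=1)[::8].round(2)}")
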

Source: www.oldcitypublishing.com