Category: Announcements

Philosophies | Special Issue: Philosophy and Epistemology of Deep Learning

Call For Papers:
Deadline for manuscript submissions: 15 March 2019
Guest Editors:
Dr. Hector Zenil
Prof. Dr. Selmer Bringsjord
Current popular approaches to Machine Learning (ML), Deep Learning (DL) and Artificial Intelligence (AI) are mostly statistical in nature and are not well equipped to deal with abstraction and explanation. In particular, they cannot generate candidate models or make generalizations directly from data to discover possible causal mechanisms. One method researchers are resorting to in order to discover how deep learning algorithms work involves what are called ‘generative models’ (a possible misnomer): they train a learning algorithm, handicap it systematically, and ask it to generate examples. By observing the resulting examples, they can make inferences about what may be happening inside the algorithm.
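
As a minimal illustration of this ‘handicap and generate’ style of probing, the toy sketch below trains a two-unit linear generative model (plain PCA standing in for a deep network) on synthetic data, silences one latent unit at a time, and inspects what the generated samples lose. All data, names and numbers here are illustrative assumptions, not anyone's published method.

    import numpy as np

    rng = np.random.default_rng(0)
    # synthetic data driven by two hidden factors (purely illustrative)
    z = rng.normal(size=(1000, 2))
    W = np.array([[2.0, 0.0, 1.0],
                  [0.0, 1.5, -1.0]])
    X = z @ W + 0.05 * rng.normal(size=(1000, 3))

    # "train" a two-unit linear generative model: PCA as a toy stand-in
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    codes = Xc @ Vt[:2].T                    # latent activations

    def generate(c):
        # decoder: map latent codes back to data space
        return c @ Vt[:2] + X.mean(axis=0)

    # systematically handicap the model: silence one latent unit at a time
    for unit in range(2):
        ablated = codes.copy()
        ablated[:, unit] = 0.0
        lost = X.var(axis=0) - generate(ablated).var(axis=0)
        print(f"unit {unit} silenced -> variance lost per dimension: {np.round(lost, 2)}")

Which dimensions of the generated output collapse when a unit is silenced tells the observer what that unit was encoding, which is the inference step the paragraph above describes.
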
However, current trends and methods are widely considered black-box approaches that have worked amazingly well in classification tasks, but provide little to no understanding of causation and are unable to deal with forms of symbolic computation such as logical inference and explanation. As a consequence, they also fail to scale to domains they have not been trained on, require vast amounts of training data before they can do anything interesting, and require retraining every time they are presented with (even slightly) different data.
Furthermore, how other cognitive features, such as human consciousness, may be related to current and future directions in deep learning, and whether such features may prove advantageous or disadvantageous remains an open question.
The aim of this special issue is thus to attempt to ask the right questions and shed some light on the achievements, limitations and future directions of reinforcement/deep learning approaches and differentiable programming. Its particular focus will be on the interplay of data- and model-driven approaches that go beyond current ones, which for the most part are based on traditional statistics. It will attempt to ascertain whether a fundamental theory is needed or whether one already exists, and to explore the implications of current and future technologies based on deep learning and differentiable programming for science, technology and society.

Special issue website:

https://www.mdpi.com/journal/philosophies/special_issues/deep_learning

Source: www.mdpi.com

NECSI Executive Courses: Integrating Artificial and Human Intelligence & Reshaping Strategy in the Age of Data

Business and society are transforming and becoming increasingly complex. Artificial Intelligence, machine learning, big data analytics and hybrid human-machine systems are playing an increasing role in business products and strategy, and in the organization itself.

NECSI is hosting two courses as part of its week-long NECSI Executive 2018 Fall Program in Boston, MA. Each course can stand alone, but together they form a potent and practical training experience for the executive leader. 

Source: necsi-exec.org

The Prize in Economic Sciences 2018

At its heart, economics deals with the management of scarce resources. Nature dictates the main constraints on economic growth and our knowledge determines how well we deal with these constraints. This year’s Laureates William Nordhaus and Paul Romer have significantly broadened the scope of economic analysis by constructing models that explain how the market economy interacts with nature and knowledge.

Technological change – Romer demonstrates how knowledge can function as a driver of long-term economic growth. When annual economic growth of a few per cent accumulates over decades, it transforms people’s lives. Previous macroeconomic research had emphasised technological innovation as the primary driver of economic growth, but had not modelled how economic decisions and market conditions determine the creation of new technologies. Paul Romer solved this problem by demonstrating how economic forces govern the willingness of firms to produce new ideas and innovations.

Romer’s solution, which was published in 1990, laid the foundation of what is now called endogenous growth theory. The theory is both conceptual and practical, as it explains how ideas are different to other goods and require specific conditions to thrive in a market. Romer’s theory has generated vast amounts of new research into the regulations and policies that encourage new ideas and long-term prosperity.
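
For readers who want the formal core of the argument, a one-line sketch of the knowledge-accumulation equation at the heart of Romer's 1990 model follows; the notation is the common textbook one, not a quotation from the Laureate's paper.

    % Knowledge accumulation in the Romer (1990) model (textbook notation):
    %   A = stock of ideas, H_A = human capital devoted to research,
    %   \delta = research productivity.
    \dot{A} = \delta H_A A
    % Because the existing stock A raises the productivity of researchers
    % (ideas are nonrival: using one does not use it up), the stock of ideas
    % grows at the constant rate g_A = \delta H_A -- more research effort
    % means permanently faster, not just temporarily higher, growth.
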

Climate change – Nordhaus’ findings deal with interactions between society and nature. Nordhaus decided to work on this topic in the 1970s, as scientists had become increasingly worried about the combustion of fossil fuels resulting in a warmer climate. In the mid-1990s, he became the first person to create an integrated assessment model, i.e. a quantitative model that describes the global interplay between the economy and the climate. His model integrates theories and empirical results from physics, chemistry and economics. Nordhaus’ model is now widely used to simulate how the economy and the climate co-evolve, and to examine the consequences of climate policy interventions, for example carbon taxes.
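
The sketch below is a deliberately crude toy in the spirit of an integrated assessment model, not Nordhaus’ actual DICE model: emissions follow output, warming follows emissions, damages follow warming, and a hypothetical carbon tax dials emissions down. Every parameter is an illustrative assumption.

    # Toy economy-climate feedback loop; all parameters are illustrative.
    years = 100
    output = 100.0        # gross world product, arbitrary units
    temp = 1.0            # warming above pre-industrial, degrees C
    carbon_tax = 40.0     # hypothetical policy lever, $/ton

    for t in range(years):
        abatement = min(1.0, carbon_tax / 200.0)  # higher tax -> more abatement
        emissions = 0.3 * output * (1.0 - abatement)
        temp += 0.0005 * emissions                # toy carbon-climate response
        damages = 0.002 * temp ** 2               # damages as a share of output
        output *= 1.02 * (1.0 - damages)          # growth net of climate damages

    print(f"after {years} years: output={output:.1f}, warming={temp:.2f} C")

Rerunning the loop with different values of carbon_tax is the toy analogue of the policy experiments such models are used for.
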

The contributions of Paul Romer and William Nordhaus are methodological, providing us with fundamental insights into the causes and consequences of technological innovation and climate change. This year’s Laureates do not deliver conclusive answers, but their findings have brought us considerably closer to answering the question of how we can achieve sustained and sustainable global economic growth.

Source: www.nobelprize.org

The Nobel Prize in Chemistry 2018

Since the first seeds of life arose around 3.7 billion years ago, almost every crevice on Earth has filled with different organisms. Life has spread to hot springs, deep oceans and dry deserts, all because evolution has solved a number of chemical problems. Life’s chemical tools – proteins – have been optimised, changed and renewed, creating incredible diversity.

This year’s Nobel Laureates in Chemistry have been inspired by the power of evolution and used the same principles – genetic change and selection – to develop proteins that solve mankind’s chemical problems.

One half of this year’s Nobel Prize in Chemistry is awarded to Frances H. Arnold. In 1993, she conducted the first directed evolution of enzymes, which are proteins that catalyse chemical reactions. Since then, she has refined the methods that are now routinely used to develop new catalysts. The uses of Frances Arnold’s enzymes include more environmentally friendly manufacturing of chemical substances, such as pharmaceuticals, and the production of renewable fuels for a greener transport sector.
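
In silico, the mutate–screen–select loop that Arnold runs in the laboratory looks roughly like the toy below. The fitness function here is a made-up stand-in for a real activity assay, and the target sequence is arbitrary; only the loop structure reflects the method.

    import random

    random.seed(1)
    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def fitness(seq):
        # hypothetical screen: in the lab this is an activity assay;
        # here we simply reward matching an arbitrary target sequence
        target = "MKTAYIAKQR"
        return sum(a == b for a, b in zip(seq, target))

    def mutate(seq, rate=0.1):
        # random mutagenesis: flip each position with some probability
        return "".join(random.choice(AMINO_ACIDS) if random.random() < rate else a
                       for a in seq)

    parent = "".join(random.choice(AMINO_ACIDS) for _ in range(10))
    for generation in range(30):
        library = [parent] + [mutate(parent) for _ in range(200)]
        parent = max(library, key=fitness)    # screen and keep the best variant
    print(parent, fitness(parent))

Keeping the parent in each library guarantees fitness never regresses, mirroring the fact that a lab campaign always retains its current best enzyme.
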

The other half of this year’s Nobel Prize in Chemistry is shared by George P. Smith and Sir Gregory P. Winter. In 1985, George Smith developed an elegant method known as phage display, where a bacteriophage – a virus that infects bacteria – can be used to evolve new proteins. Gregory Winter used phage display for the directed evolution of antibodies, with the aim of producing new pharmaceuticals. The first one based on this method, adalimumab, was approved in 2002 and is used for rheumatoid arthritis, psoriasis and inflammatory bowel diseases. Since then, phage display has produced antibodies that can neutralise toxins, counteract autoimmune diseases and cure metastatic cancer.

We are in the early days of directed evolution’s revolution, which, in many different ways, is bringing and will bring the greatest benefit to humankind.

Source: www.nobelprize.org

The Nobel Prize in Physics 2018

Arthur Ashkin invented optical tweezers that grab particles, atoms, viruses and living cells with their laser beam fingers. This new tool allowed Ashkin to realise an old dream of science fiction – using the radiation pressure of light to move physical objects. He succeeded in getting laser light to push small particles towards the centre of the beam and to hold them there. Optical tweezers had been invented.
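
The radiation pressure involved is tiny by everyday standards but enormous at the scale of a bacterium. A back-of-envelope estimate, assuming a 1 W beam fully absorbed by the target (illustrative numbers, not Ashkin's experimental values):

    % Momentum flux of a fully absorbed beam of power P gives a force
    F = \frac{P}{c} = \frac{1\,\mathrm{W}}{3\times10^{8}\,\mathrm{m/s}}
      \approx 3.3\,\mathrm{nN}

A few nanonewtons are negligible for macroscopic objects but substantial for a micron-sized bead or bacterium, which is why light can hold such objects in place.
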

A major breakthrough came in 1987, when Ashkin used the tweezers to capture living bacteria without harming them. He immediately began studying biological systems, and optical tweezers are now widely used to investigate the machinery of life.

Gérard Mourou and Donna Strickland paved the way towards the shortest and most intense laser pulses ever created by mankind. Their revolutionary article was published in 1985 and was the foundation of Strickland’s doctoral thesis.

Using an ingenious approach, they succeeded in creating ultrashort high-intensity laser pulses without destroying the amplifying material. First they stretched the laser pulses in time to reduce their peak power, then amplified them, and finally compressed them. If a pulse is compressed in time and becomes shorter, then more light is packed together in the same tiny space – the intensity of the pulse increases dramatically.
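
The arithmetic behind the stretch–amplify–compress trick is simple: peak power is pulse energy divided by pulse duration, so stretching buys headroom for amplification. The numbers below are illustrative orders of magnitude, not figures from the 1985 paper.

    # Back-of-envelope chirped pulse amplification arithmetic (toy numbers).
    energy = 1e-9          # seed pulse energy, joules
    duration = 1e-13       # seed pulse duration, 100 femtoseconds
    print(f"seed peak power:      {energy / duration:.1e} W")

    stretch = 1e4          # stretch the pulse 10,000x in time
    peak_stretched = energy / (duration * stretch)   # now safe to amplify
    print(f"stretched peak power: {peak_stretched:.1e} W")

    gain = 1e6             # amplifier energy gain
    energy *= gain
    peak_final = energy / duration   # recompressed to the original length
    print(f"final peak power:     {peak_final:.1e} W")

With these toy numbers the stretched pulse stays at a modest peak power inside the amplifier, yet recompression yields a million-fold increase over the seed, which is the whole point of the technique.
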

Strickland and Mourou’s newly invented technique, called chirped pulse amplification (CPA), soon became standard for subsequent high-intensity lasers. Its uses include the millions of corrective eye surgeries that are conducted every year using the sharpest of laser beams.

The innumerable areas of application have not yet been completely explored. However, even now these celebrated inventions allow us to rummage around in the microworld in the best spirit of Alfred Nobel – for the greatest benefit to humankind.

Source: www.nobelprize.org