Month: October 2017

Rapid rise and decay in petition signing

Contemporary collective action, much of which involves social media and other Internet-based platforms, leaves a digital imprint which may be harvested to better understand the dynamics of mobilization. Petition signing is an example of collective action which has gained in popularity with rising use of social media and provides such data for the whole population of petition signatories for a given platform. This paper tracks the growth curves of all 20,000 petitions to the UK government petitions website (http://epetitions.direct.gov.uk) and 1,800 petitions to the US White House site (https://petitions.whitehouse.gov), analyzing the rate of growth and outreach mechanism. Previous research has suggested the importance of the first day to the ultimate success of a petition, but has not examined early growth within that day, made possible here through hourly resolution in the data. The analysis shows that the vast majority of petitions do not achieve any measure of success; over 99 percent fail to get the 10,000 signatures required for an official response and only 0.1 percent attain the 100,000 required for a parliamentary debate (0.7 percent in the US). We analyze the data through a multiplicative process model framework to explain the heterogeneous growth of signatures at the population level. We define and measure an average outreach factor for petitions and show that it decays very fast (reducing to 0.1% after 10 hours in the UK and 30 hours in the US). After a day or two, a petition’s fate is virtually set. The findings challenge conventional analyses of collective action from economics and political science, where the production function has been assumed to follow an S-shaped curve.

 

Rapid rise and decay in petition signing
Taha Yasseri, Scott A. Hale and Helen Z. Margetts
EPJ Data Science 2017, 6:20
https://doi.org/10.1140/epjds/s13688-017-0116-6

Source: epjdatascience.springeropen.com
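The multiplicative-growth picture with a fast-decaying outreach factor can be sketched in a few lines of Python. The growth rate, decay constant, and noise model below are illustrative assumptions, not the paper's fitted values; the point is only that once the outreach factor has decayed, the signature count is effectively frozen.

```python
import random

def simulate_petition(n0=10, hours=200, r0=1.0, decay=0.5, noise=0.3):
    """Toy multiplicative process: each hour the signature count grows
    by a factor (1 + growth), where growth is proportional to an
    outreach factor r that decays geometrically and carries noise."""
    n = float(n0)
    r = r0
    history = [n]
    for _ in range(hours):
        growth = r * max(0.0, 1.0 + random.gauss(0, noise))
        n *= 1.0 + growth
        r *= decay  # outreach fades each hour
        history.append(n)
    return history

random.seed(42)
traj = simulate_petition()
print(f"signatures after 24h: {traj[24]:.0f}, after {len(traj) - 1}h: {traj[-1]:.0f}")
```

With these (assumed) parameters virtually all of the growth happens in the first day, mirroring the paper's finding that a petition's fate is set early.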

The 2017 Nobel Prize in Chemistry

We may soon have detailed images of life’s complex machineries in atomic resolution. The Nobel Prize in Chemistry 2017 is awarded to Jacques Dubochet, Joachim Frank and Richard Henderson for the development of cryo-electron microscopy, which both simplifies and improves the imaging of biomolecules. This method has moved biochemistry into a new era.

A picture is a key to understanding. Scientific breakthroughs often build upon the successful visualisation of objects invisible to the human eye. However, biochemical maps have long been filled with blank spaces because the available technology has had difficulty generating images of much of life’s molecular machinery. Cryo-electron microscopy changes all of this. Researchers can now freeze biomolecules mid-movement and visualise processes they have never previously seen, which is decisive for both the basic understanding of life’s chemistry and for the development of pharmaceuticals.

Electron microscopes were long believed to only be suitable for imaging dead matter, because the powerful electron beam destroys biological material. But in 1990, Richard Henderson succeeded in using an electron microscope to generate a three-dimensional image of a protein at atomic resolution. This breakthrough proved the technology’s potential.

Joachim Frank made the technology generally applicable. Between 1975 and 1986 he developed an image processing method in which the electron microscope’s fuzzy two-dimensional images are analysed and merged to reveal a sharp three-dimensional structure.

Jacques Dubochet added water to electron microscopy. Liquid water evaporates in the electron microscope’s vacuum, which makes the biomolecules collapse. In the early 1980s, Dubochet succeeded in vitrifying water – he cooled water so rapidly that it solidified in its liquid form around a biological sample, allowing the biomolecules to retain their natural shape even in a vacuum.

Following these discoveries, the electron microscope’s every nut and bolt have been optimised. The desired atomic resolution was reached in 2013, and researchers can now routinely produce three-dimensional structures of biomolecules. In the past few years, scientific literature has been filled with images of everything from proteins that cause antibiotic resistance, to the surface of the Zika virus. Biochemistry is now facing an explosive development and is all set for an exciting future.

Source: www.nobelprize.org

The origins of intelligent life

In his ambitious book Life Through Time and Space, Wallace Arthur tackles an extraordinarily difficult set of topics. What is the origin and fate of the universe? How did life, and eventually intelligent life, come into existence on Earth? How does a fertilized human egg transform into a complex person with only DNA to guide development?

 

The origins of intelligent life
Marcos Huerta
Life Through Time and Space, Wallace Arthur, Harvard University Press, 2017. 289 pp.
Science, 11 Aug 2017:
Vol. 357, Issue 6351, pp. 556
DOI: 10.1126/science.aao0931

Source: science.sciencemag.org

Complexity of evolutionary equilibria in static fitness landscapes

Experiments show that fitness landscapes can have a rich combinatorial structure due to epistasis, and yet theory assumes that local peaks can be reached quickly. I introduce a distinction between easy landscapes, where local fitness peaks can be found in a moderate number of steps, and hard landscapes, where finding evolutionary equilibria requires an infeasible amount of time. Hard examples exist even among landscapes with no reciprocal sign epistasis; on these, strong-selection weak-mutation dynamics cannot find the unique peak in polynomial time. On hard rugged fitness landscapes, no evolutionary dynamics, even ones that do not follow adaptive paths, can find a local fitness peak quickly; and the fitness advantage of nearby mutants cannot drop off exponentially fast but has to follow a power law that long-term evolution experiments have associated with unbounded growth in fitness. I present candidates for hard landscapes at scales from single genes, to microbes, to complex organisms with costly learning (the Baldwin effect). Even though hard landscapes are static and finite, local evolutionary equilibrium cannot be assumed.

Source: www.biorxiv.org
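The strong-selection weak-mutation dynamics the abstract refers to can be illustrated with a toy adaptive walk. For simplicity the sketch below uses an uncorrelated ("house of cards") random landscape over binary genotypes rather than a structured NK landscape; that choice is an assumption made for brevity, but it suffices to show a walk that climbs strictly uphill and terminates at a local peak.

```python
import random

def random_landscape(n, seed=0):
    """Assign an i.i.d. random fitness to every length-n binary genotype
    (a maximally rugged 'house of cards' landscape)."""
    rng = random.Random(seed)
    return {g: rng.random() for g in range(2 ** n)}

def adaptive_walk(fitness, n, start):
    """Strong-selection weak-mutation dynamics: repeatedly jump to the
    fittest one-bit-flip neighbour until no neighbour is fitter,
    i.e. until a local peak is reached. Returns the genotypes visited."""
    path = [start]
    current = start
    while True:
        neighbours = [current ^ (1 << k) for k in range(n)]
        best = max(neighbours, key=fitness.get)
        if fitness[best] <= fitness[current]:
            return path  # local evolutionary equilibrium
        current = best
        path.append(current)

n = 10
fit = random_landscape(n)
path = adaptive_walk(fit, n, start=0)
peak = path[-1]
print(f"reached a local peak after {len(path) - 1} uphill steps")
```

On small random landscapes such walks are short; the paper's point is that on hard landscapes no dynamics of this kind (or any other) can reach equilibrium in feasible time, which no toy example of this size can exhibit.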

Reliable uncertainties in indirect measurements

In this article we present a very intuitive, easy-to-follow, yet mathematically rigorous approach to the so-called data-fitting process. Rather than minimizing the distance between measured and simulated data points, we search the parameter space for a region that generates a simulated curve crossing as many of the acquired experimental points as possible, but at least half of them. Such a task is easy to attack with interval calculations. The problem, however, is that interval calculations operate on guaranteed intervals, that is, on pairs of numbers giving the minimal and maximal values of a measured quantity, while in the vast majority of cases our measured quantities are expressed as a different pair of numbers: the average value and its standard deviation. Here we propose combining interval calculus with basic notions from probability and statistics. This approach makes it possible to obtain results in the familiar form of reliable values of the searched parameters, their standard deviations, and their correlations as well. There are no assumptions concerning the probability density distributions of the experimental values besides the obvious one that their variances are finite. Neither symmetry of the experimental uncertainties is required (assumed), nor do those uncertainties have to be ‘small.’ As a side effect, outliers are quietly and safely ignored, even if numerous.

 

Reliable uncertainties in indirect measurements
Marek W. Gutowski

Source: arxiv.org
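As a rough illustration of the crossing criterion (not the authors' interval algorithm, which works on guaranteed intervals in parameter space), the sketch below scores each candidate line by how many ±1σ measurement intervals it crosses and keeps the parameters that cross the most, requiring at least half. The linear model, the parameter grid, and the data are invented for the example; note how the gross outlier is simply never crossed and so drops out of the fit.

```python
def crossing_count(a, b, data):
    """Number of measurement intervals y ± s that the line y = a*x + b crosses."""
    return sum(1 for x, y, s in data if abs(a * x + b - y) <= s)

# synthetic data near y = 2x + 1 with ±0.5 uncertainty, plus one gross outlier
data = [(0, 1.1, 0.5), (1, 2.8, 0.5), (2, 5.2, 0.5),
        (3, 7.1, 0.5), (4, 8.9, 0.5), (2, 20.0, 0.5)]

# brute-force scan of a grid over the (a, b) parameter space
results = [(crossing_count(ia * 0.01, ib * 0.01, data), ia * 0.01, ib * 0.01)
           for ia in range(0, 401) for ib in range(-200, 201)]

top = max(h for h, _, _ in results)
assert top >= len(data) / 2           # "as many as possible, but at least half"
region = [(a, b) for h, a, b in results if h == top]

a_lo, a_hi = min(a for a, _ in region), max(a for a, _ in region)
b_lo, b_hi = min(b for _, b in region), max(b for _, b in region)
print(f"best lines cross {top}/{len(data)} points; "
      f"a in [{a_lo:.2f}, {a_hi:.2f}], b in [{b_lo:.2f}, {b_hi:.2f}]")
```

The accepted region of the (a, b) plane plays the role of the result: its extent along each axis gives the reported parameter uncertainty, and the outlier influences nothing because no acceptable line reaches it.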