The market setup has a well-defined homogeneous rational expectations equilibrium. It is shown that restricting strategies to be long horizon enables convergence to this equilibrium, where agents agree on the appropriate pricing function and trading volume goes to zero. Allowing a broad range of trader horizons into the market prevents this convergence and instead drives the market to a state that more closely resembles real markets: high trading volume, price volatility well above what fundamentals would justify, fat-tailed return distributions, and persistence in both volume and volatility.
Experiments are performed to explore conditions that might drive the market to the homogeneous equilibrium. Imposing a transaction cost on changing strategies slows the individual agents' exploration enough for the market to converge. Also, initially seeding the market with a large number of long-horizon agents can produce convergence to the equilibrium, which is then relatively impervious to invasion by short-horizon traders.
This framework suggests many further explorations, both in calibrating time series and in policy-related examinations of heterogeneous agent populations. Fundamentally, it emphasizes the importance of learning about time, and of how agents use past data, in real financial markets.
- Evolution and Time Horizons in an Agent Based Stock Market, Blake LeBaron, Working Paper, Brandeis University, Rev. 4/00, (Contributed by the author)
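The horizon mechanism can be illustrated with a minimal sketch (an assumption-laden toy, not LeBaron's actual model): each agent forecasts the next return as the average past return over its own evaluation horizon, so agents with different horizons read the same price history differently, and that disagreement is what sustains trade.

```python
import random

def forecast(prices, horizon):
    """An agent's expected next-period return: the mean past return
    over its own evaluation horizon (a deliberately simple rule)."""
    window = prices[-(horizon + 1):]
    returns = [(b - a) / a for a, b in zip(window, window[1:])]
    return sum(returns) / len(returns)

# Two hypothetical agents reading the same simulated price history:
random.seed(1)
prices = [100.0]
for _ in range(300):
    prices.append(prices[-1] * (1 + random.gauss(0.0005, 0.02)))

short_view = forecast(prices, 5)    # short-horizon agent
long_view = forecast(prices, 250)   # long-horizon agent
# In general short_view != long_view: the agents disagree about the
# expected return, which generates trading volume.
```

The sketch shows only the data-window effect; the paper's agents additionally evolve their forecasting rules, which is where the convergence and invasion dynamics come from.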
Shraiman and Siggia review recent progress on "passive scalar" turbulence, i.e. the simpler case in which the transported quantity does not influence the flow patterns. Among the most important examples is the spread of pollutants in air and water. While in non-turbulent spreading of pollutants one would expect the concentration to decrease smoothly with distance from the source, the situation is much more complex in the turbulent case: the spreading can be much more rapid and highly non-uniform, or "intermittent", in both space and time. This means, for instance, that radioactive particles from a reactor accident could arrive at a distant location in a very short time and at a concentration much higher than simple, uniform diffusion assumptions would predict.
The researchers point out that theoretical progress on scalar turbulence was possible even without corresponding progress in understanding the turbulent velocity field. One could show that probability distributions of concentration values in scalar turbulence have properties similar to those observed in financial markets, such as "fat tails".
Furthermore, mathematical progress could be made by shifting from an Eulerian viewpoint (observation from the laboratory as the reference frame) to a Lagrangian perspective (the observation point moves along with the flow). It also turned out to be important to go beyond the traditional description in terms of pair-wise correlations to "multi-point correlators".
These new theoretical insights, together with the steady increase in available computational power, give rise to the hope that the understanding of turbulent fluid flow will increase dramatically during this century, even if the turbulence problem is never "solved" in the traditional sense.
- Scalar Turbulence, Boris I. Shraiman and Eric D. Siggia, Nature 405, 639 - 646 (2000)
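The contrast between smooth diffusion and rapid turbulent spreading can be quantified with a classical scaling sketch: ordinary diffusion spreads as the square root of 2Dt, while turbulent pair dispersion follows Richardson's t-cubed law. The constants below (the prefactor g, the diffusivity, and the dissipation rate) are illustrative assumptions, not figures from the review.

```python
import math

def diffusive_spread(D, t):
    """RMS displacement after time t under ordinary diffusion."""
    return math.sqrt(2.0 * D * t)

def richardson_spread(epsilon, t, g=0.5):
    """RMS pair separation under Richardson's t^3 scaling for
    turbulent dispersion (g and epsilon are illustrative values)."""
    return math.sqrt(g * epsilon * t ** 3)

# After ~17 minutes, molecular diffusion in air moves a pollutant
# centimetres, while turbulent dispersion can separate particles by
# kilometres -- the "much more rapid" spreading of the review:
t = 1000.0                          # seconds
slow = diffusive_spread(1e-5, t)    # ~0.14 m
fast = richardson_spread(1e-2, t)   # ~2.2 km
```

The t-cubed growth also hints at the intermittency the review describes: small initial separations are amplified explosively rather than smoothed out.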
Abstract: An effective environmental force constant is introduced to quantify the molecular resilience (or its opposite, "softness") of a protein structure and relate it to biological function and activity. Specific resilience-function relations were found in neutron-scattering experiments on purple membranes containing bacteriorhodopsin, the light-activated proton pump of halobacteria; the connection between resilience and stability is illustrated by a study of myoglobin in different environments. Important advantages of the neutron method are that it can characterize the dynamics of any type of biological sample--which need not be crystalline or monodisperse--and that it enables researchers to focus on the dynamics of specific parts of a complex structure with deuterium labeling.
- How Soft Is a Protein? A Protein Dynamics Force Constant Measured by Neutron Scattering, G. Zaccai, Science 2000 June 2; 288(5471): p. 1604-1607
This claim has been made for stationary billiards, and since quantum-well nanostructures do experience vibrations, vibrating billiards may model them somewhat more faithfully. The radially vibrating spherical billiard, for example, models quantum dots that are termed "ballistic", a property described by the Dirichlet boundary conditions. Despite this, the major import of this project is theoretical, as is true for nearly all studies of toy models.
Now they report a careful refinement of their original experiment in which they produce composite patterns of lines with variable transparency. With this arrangement the apparent visual pattern can be tuned continuously so that the lines seem to belong either to two different surfaces or to one and the same surface. The arrangement is important because it keeps the average activity of the neurons constant while the visual patterns make the transition between the two different percepts. This observation rules out the possibility that average neuronal activity, or firing rate, encodes the information about the nature of the observed surfaces. The synchronization between neurons responding to the two different stripe patterns, on the other hand, is present only when the transparency is tuned so that a coherent surface is perceived. This clever experimental arrangement therefore provides further evidence that the timing of neuronal spikes does indeed carry important information.
- Neural Synchrony Correlates With Surface Segregation Rules, Miguel Castelo-Branco, Rainer Goebel, Sergio Neuenschwander & Wolf Singer, Nature 405, 685 - 689 (2000)
Though much more analysis remains, an initial look at the Hubble evidence favors the idea that titanic black holes did not precede a galaxy's birth but instead evolved with the galaxy, trapping a remarkably constant fraction (0.2 percent) of the mass of the bulbous hub of stars and gas in a galaxy.
This means that black holes in small galaxies went relatively undernourished, weighing in at a mere few million solar masses. Black holes in the centers of giant galaxies, some tipping the scale at over one billion solar masses, were so engorged with infalling gas they once blazed as quasars, the brightest objects in the cosmos.
The bottom line is that the final mass of a black hole is not primordial; it is determined during the galaxy formation process. "This supports the original theory of why black holes are important and how they got their masses. It suggests that the major events that made a galaxy and the ones that made its black hole shine as a quasar were the same events," says John Kormendy of the University of Texas at Austin. "These results are a catalyst that helps to tie together many lines of investigation." (…)
The results also explain why galaxies with small bulges, like our Milky Way, have diminutive central black holes of a few million solar masses, while giant elliptical galaxies house billion-solar-mass black holes, some still smoldering from their days as quasars. Disk galaxies without a central bulge of stars either have no black hole or have only tiny black holes that are well below Hubble's detection limit.
The findings are based on two types of Hubble observations. Several teams measured the black holes' masses by recording the whirling speeds of disks of gas trapped around the black holes, like water swirling around a drain. Other teams measured the motions of stars around the galaxies' hubs, like a swarm of bees hovering around a beehive. The more massive the bulge, the greater the speed of the stars. (…)
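The 0.2 percent correlation reported above reduces to simple arithmetic; the bulge masses below are illustrative round numbers, not Hubble measurements.

```python
def implied_bh_mass(bulge_mass_msun, fraction=0.002):
    """Central black-hole mass implied by the ~0.2% bulge-mass
    correlation (illustrative arithmetic only)."""
    return fraction * bulge_mass_msun

small = implied_bh_mass(1e9)    # small bulge -> ~2 million solar masses
giant = implied_bh_mass(5e11)   # giant bulge -> ~1 billion solar masses
```

The two round-number cases reproduce the article's "few million" and "over one billion" solar-mass extremes.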
Self-replication of molecular systems is often viewed in the context of information content. Many scientists believe that life began with the spontaneous emergence of biopolymers, such as proteins or RNA, where information is stored in the sequence of chemical units. Experiments mimicking the conditions on Earth billions of years ago have shown how such chemical units, e.g. some of the building blocks of proteins and RNA, could appear spontaneously. Yet, the emergence of proteins or self-replicating RNA molecules remained enigmatic.
This started Prof. Doron Lancet of the Crown Human Genome Center at the Weizmann Institute of Science, and his students, Daniel Segre and Dafna Ben-Eli, on a journey leading to alternatives to proteins and RNA. They have developed a model suggesting a new route for the origin of life, based on lipid molecules. The model is described in an article published in a recent issue of the Proceedings of the National Academy of Sciences, USA. (…)
The model proposed by Lancet and colleagues offers a solution. They surmise that early on, lipid-like compounds existed in a very large diversity of shapes and forms. They show mathematically that under such conditions, lipid assemblies could contain almost as much information as an RNA strand or a protein chain. Information would be stored in the assembly's composition, i.e. in the exact amount of each of its compounds, rather than in a sequence of molecular "beads" on a string. (…)
Thus, the authors argue, heterogeneous lipid assemblies may be thought of as having a "compositional genome". They further demonstrate how a droplet-like lipid assembly, when growing and splitting, could manifest a form of inheritance. Their computer simulations show how a compositional genome would be handed down with some fidelity to the offspring assemblies. A crucial aspect of the model is how such molecular inheritance is made possible. In present-day cells, the replication of information-containing DNA is facilitated by protein enzyme catalysts. In the early prebiological era, catalysis could be performed by the same lipid-like substances that carry the information. Molecules already present inside a droplet would function as a molecular selection committee, enhancing the rate of entry for some, and rejecting others. (…)
- New Theory On The Mystery Of The Origin Of Life Proposed By Weizmann Institute Scientists, Science Daily, Posted 6/8/2000
- News Release Weizmann Institute
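The growth-and-splitting mechanism can be sketched in a few lines. This is an assumption-laden simplification of the Segre-Lancet model (whose actual catalysis matrix is random, not hand-picked): molecules already inside the assembly boost the entry rates of favoured types, acting as the "molecular selection committee", so a daughter assembly inherits something close to its parent's composition.

```python
import random

def grow_and_split(parent, n_types, boost, target_size, rng):
    """Grow a lipid assembly by catalysed accretion, then split it.

    `parent` is a list of molecule-type indices (the composition).
    `boost[i][j]` is the hypothetical catalytic enhancement that a
    resident molecule of type i gives to the entry of type j.
    """
    assembly = list(parent)
    while len(assembly) < target_size:
        # entry rate of each candidate type is raised by the molecules
        # already present -- the "molecular selection committee"
        weights = [1.0 + sum(boost[i][j] for i in assembly)
                   for j in range(n_types)]
        assembly.append(rng.choices(range(n_types), weights=weights)[0])
    rng.shuffle(assembly)
    half = len(assembly) // 2
    return assembly[:half], assembly[half:]

# A composition dominated by type 0, whose members catalyse further
# type-0 entries, is inherited with fair fidelity by both daughters:
boost = [[0.0] * 4 for _ in range(4)]
boost[0][0] = 5.0
rng = random.Random(0)
child_a, child_b = grow_and_split([0] * 10, 4, boost, 40, rng)
```

Information here lives in the composition vector rather than in a sequence, which is exactly the "compositional genome" idea of the article.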
The power of linkage analysis to locate susceptibility loci for complex diseases is much greater in animal models than in humans, for a variety of reasons. First, inbred strains are used, which tends to reduce the genetic complexity and limit genetic effects to the loci that differentiate the original strains used in the breeding experiments. Second, by design, all matings are informative (e.g., all parents are heterozygous in an intercross). Third, all matings have the same phase (i.e., the genotypes at a presumed disease locus are known), and offspring from all matings can be combined into a single analysis. (...)
- Searching For Genes In Complex Diseases: Lessons From Systemic Lupus Erythematosus, Neil Risch, J Clin Invest, June 2000, Volume 105, Number 11, 1503-1506
Abstract: Entropy, as it relates to dynamical systems, is the rate of information production. Methods for estimation of the entropy of a system represented by a time series are not, however, well suited to analysis of the short and noisy data sets encountered in cardiovascular and other biological studies. Pincus introduced approximate entropy (ApEn), a set of measures of system complexity closely related to entropy, which is easily applied to clinical cardiovascular and other time series. ApEn statistics, however, lead to inconsistent results. We have developed a new and related complexity measure, sample entropy (SampEn), and have compared ApEn and SampEn by using them to analyze sets of random numbers with known probabilistic character. We have also evaluated cross-ApEn and cross-SampEn, which use cardiovascular data sets to measure the similarity of two distinct time series. SampEn agreed with theory much more closely than ApEn over a broad range of conditions. The improved accuracy of SampEn statistics should make them useful in the study of experimental clinical cardiovascular and other biological time series.
- Physiological Time-Series Analysis Using Approximate Entropy And Sample Entropy, J. S. Richman, J. R. Moorman, AJP Heart Online Vol. 278, Issue 6, H2039-H2049, June 2000
- Entropies Of Short Binary Sequences In Heart Period Dynamics, D. Cysarz, H. Bettermann, P. van Leeuwen, AJP Heart Online Vol. 278, Issue 6, H2163-H2172, June 2000
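SampEn as defined by Richman and Moorman can be sketched directly from its definition: count pairs of length-m and length-(m+1) templates that match within a tolerance, excluding self-matches, and take the negative log of the ratio. This quadratic-time version is a sketch only; the tolerance convention (r times the series standard deviation) is a common choice assumed here, not necessarily the authors' exact implementation.

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Sample entropy (SampEn) of a one-dimensional series.

    Counts pairs of templates of length m and m + 1 whose Chebyshev
    distance is within r * SD; unlike ApEn, self-matches are excluded,
    which removes ApEn's bias toward apparent regularity.
    """
    n = len(series)
    mean = sum(series) / n
    sd = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    tol = r * sd

    def matches(length):
        templates = [series[i:i + length] for i in range(n - length)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):  # j > i: no self-matches
                if max(abs(a - b) for a, b in
                       zip(templates[i], templates[j])) <= tol:
                    count += 1
        return count

    b = matches(m)        # length-m template matches
    a = matches(m + 1)    # length-(m + 1) template matches
    if a == 0 or b == 0:
        return float("inf")  # undefined for too-short or too-irregular data
    return -math.log(a / b)
```

A perfectly alternating series such as [1, 2, 1, 2, ...] yields a SampEn near zero, reflecting total regularity; irregular series score higher.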
Excerpt: The record of Archaean microfossils is sparse. (…) Here, I report the discovery of pyritic filaments, the probable fossil remains of thread-like microorganisms, in a 3,235-million-year-old deep-sea volcanogenic massive sulphide deposit from the Pilbara Craton of Australia. (…) They represent the first fossil evidence for microbial life in a Precambrian submarine thermal spring system, and extend the known range of submarine hydrothermal biota by more than 2,700 million years. Such environments may have hosted the first living systems on Earth, consistent with proposals for a thermophilic origin of life.
- Filamentous Microfossils In A 3,235-Million-Year-Old Volcanogenic Massive Sulphide Deposit, Birger Rasmussen, Nature 405, 676 - 679 (2000)
Excerpt: Here we assess whether reactions whose rates are affected by the orientation of reactants in magnetic fields could form the basis of a biological compass. We use a general model, incorporating biological components and design criteria, to calculate realistic constraints for such a compass. This model compares a chemical signal produced owing to magnetic field effects with stochastic noise and with changes due to physiological temperature variation. Our analysis shows that a chemically based biological compass is feasible with its size, for any given detection limit, being dependent on the magnetic sensitivity of the rate constant of the chemical reaction.
Note: I have asked some experts in animal navigation and they claim there is no experimental evidence that this mechanism is actually used by animals. (gmk)
- Biological Sensing Of Small Field Differences By Magnetically Sensitive Chemical Reactions, J.C. Weaver, T.E. Vaughan, R. D. Astumian, Nature 405, 707 - 709 (2000)
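The abstract's comparison of a chemical signal with stochastic noise can be reduced to a back-of-envelope shot-noise sketch (a drastic simplification of the authors' model, which also treats physiological temperature variation): a fractional change delta in a reaction rate stands out of counting noise only if delta times N exceeds the square root of N, i.e. N must exceed 1/delta squared.

```python
import math

def min_reacting_molecules(delta):
    """Smallest number N of reaction events for which a fractional
    rate change `delta` exceeds the sqrt(N) counting-noise floor
    (delta * N > sqrt(N)  =>  N > 1 / delta**2)."""
    return math.ceil(1.0 / delta ** 2)

# A 1% magnetic modulation of the rate constant needs on the order of
# ten thousand reaction events to be distinguishable from noise:
n_needed = min_reacting_molecules(0.01)
```

This is why the abstract links the compass organ's size to the magnetic sensitivity of the rate constant: a smaller fractional effect demands quadratically more reacting molecules.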
Excerpt: What is it that makes humans unique among animals? Most philosophers point to language. Others single out music as another means of communication. When paired with lyrics, music is one of the most powerful and emotional means of communicating. So, what is music? How does music interact with our brain? In other words, what are the biological correlates of music? And what are its broader implications for understanding brain structure? And why, unlike other complex, rule-governed activities such as language, is musical proficiency so heavily influenced by tutoring? (...)
The conference will cover such topics as: Musical Predispositions in Infancy; Universals of Temporal Processing; Music, Cognition, Culture and Evolution; The Brain of Musicians: A Model for Functional and Structural Adaptation; Similarity, Invariance and Musical Variation; Tonal Cognition; Tonality and the Brain; Cerebral Substrates for Musical Temporal Processes; Neural Specialization for Tonal Processing; Functional Neuroanatomy of Musical Listening, Discrimination, and Performance; and Rhythm and Contour in Music and Poetry.