Attention Strengthens Neuronal Synchronization, Nature
The view of the brain as a computer is currently being
superseded by the understanding of the brain as a complex adaptive
system. Part of that transition is triggered by a new
understanding of how information is encoded and modified in the
brain. Individual neurons are not just on-off switches or
biological transistors ("wet-ware").
It has been known for a while that some aspects of the
information are encoded in the rate at which a neuron fires. With
the help of micro-electrodes connected to a loudspeaker,
researchers were basically poking around in the brain, looking for
neurons that would signal a response to a specific stimulus by a
clearly audible increase in their firing rate. More recently
Freeman, Braitenberg, and others introduced the concept that
synchronous activity of "cell-assemblies" contributes to the signal.
Now Steinmetz et al. have demonstrated the contribution of
synchronized neuronal firing to the process of paying attention to
specific aspects of the environment. They recorded the firing of
several neurons (separated by 0.5 to 4 millimeters) in the
secondary somatosensory cortex of monkeys that were trained to
shift attention between visual and tactile tasks. As one would
expect from a cell-assembly perspective, they found that during the
tactile task not only did the firing rate of the neurons increase,
but their degree of synchrony also increased significantly beyond
the level one would expect from the increased firing rate alone.
The authors did not discuss the mechanisms by which such a
sharp synchronization of just a few milliseconds could take
place. A simple explanation of the temporal coincidences of neuron
firing is the formation of spatio-temporal excitation waves. Those
waves show up universally in neuronal arrays even in the presence
of strong noise perturbations. The figure shows an example of an
artificial neural network displaying spatio-temporal chaos.
The amount of information that can be stored in those pattern
sequences is huge. But even in strong spatio-temporal chaos,
temporal synchronization between distant neurons would be observed
with a high probability.
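The key statistical point, that synchrony exceeded what the firing rates alone predict, can be illustrated with a toy calculation. All numbers below (rates, recording length, coincidence window, number of injected shared events) are invented for illustration; this is a sketch of the idea, not the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def coincidences(a, b, window):
    """Count spikes in train a that have a partner in train b within +/- window."""
    return int(sum(np.any(np.abs(b - t) <= window) for t in a))

T = 10.0        # seconds of (hypothetical) recording
rate = 30.0     # Hz per neuron
window = 0.003  # 3 ms coincidence window

# Two independent Poisson spike trains: rate coding only, no extra synchrony.
a = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))
b = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))

# Inject shared events into both trains to mimic attention-driven synchrony.
shared = np.sort(rng.uniform(0, T, 60))
a_sync = np.sort(np.concatenate([a, shared]))
b_sync = np.sort(np.concatenate([b, shared]))

# Coincidence count expected from the firing rates alone (chance level).
exp = len(a_sync) * len(b_sync) * 2 * window / T
obs = coincidences(a_sync, b_sync, window)
print(f"observed coincidences: {obs}, expected from rates alone: {exp:.1f}")
```

The observed count clearly exceeds the rate-based expectation, which is the signature of genuine synchronization rather than a mere rate increase.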
After a "Decade of the Brain" one can now observe an exciting
convergence between experiments and theoretical models.
Attention Modulates Synchronized Neuronal Firing in Primate
Somatosensory Cortex, P. N. Steinmetz, A. Roy, P. J. Fitzgerald,
S. S. Hsiao, K. O. Johnson & E. Niebur, Nature 404, 187-190 (2000)
A Chorus Line, Emilio Salinas & Ranulfo Romo, Nature 404, 131-133 (2000)
See also: Spatio-Temporal Stochastic Resonance in Excitable Media,
P. Jung, G. Mayer-Kress, Phys. Rev. Lett. 74(11), 2130-2133, 13 March 1995
How Do Brains of Children Grow?, Nature
How does a complex organ like a brain grow? That is a
very difficult question, since the growth is certainly not a
uniform change in size but also involves many developmental
changes. Thompson et al. used state-of-the-art computational
tools from continuum mechanics that allowed them to track in
considerable detail the dramatic changes that a human brain
undergoes from the age of three to the almost grown-up age of
fifteen. The basic strategy of the researchers involved "tensor"
mapping. A tensor is a mathematical object, like a point or a
vector. While a point only has a position, a vector additionally
has a direction attached to a point. Analogously, a tensor attaches
to each point in the developing brain independent directions of
change and growth that can be tracked on a computer.
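A minimal sketch of the tensor idea, using an invented smooth 2-D displacement field in place of real MRI data: the deformation gradient F = I + grad(u) is a small matrix (a tensor) at every grid point, and its determinant measures the local volume change between two scans. The field below is purely illustrative.

```python
import numpy as np

# Hypothetical 2-D displacement field u(x, y): each point of the earlier
# scan is mapped to x + u.  Smooth toy field, not real brain data.
n = 50
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
ux = 0.05 * np.sin(np.pi * x)   # stronger displacement toward one side
uy = 0.02 * y * (1 - y)         # slight growth in the other direction

# Deformation gradient F = I + grad(u): a 2x2 tensor at every grid point.
h = 1.0 / (n - 1)
dux_dx, dux_dy = np.gradient(ux, h)
duy_dx, duy_dy = np.gradient(uy, h)
F = np.stack([np.stack([1 + dux_dx, dux_dy], -1),
              np.stack([duy_dx, 1 + duy_dy], -1)], -2)

# det(F) > 1 means local growth, det(F) < 1 means local shrinkage.
J = np.linalg.det(F)
print("largest local growth factor:", J.max())
print("smallest local growth factor:", J.min())
```

Mapping det(F) over the brain volume is what lets one see, point by point, where growth is fastest, rather than just an overall size change.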
The researchers obtained their data from repeated magnetic
resonance imaging (MRI) that provides a three-dimensional,
high-resolution image of the brain during different developmental
stages. They found that one of the most striking features is a
growth in the structure that connects the two brain hemispheres
(corpus callosum). It does not grow uniformly but starts growing
in the front and then the center of growth moves more towards the
back of the head. Brain areas that are related to association and
language slow down in their growth after puberty. During the same
time the amount of gray matter deep inside the brain is reduced.
At young ages (3-6 years) the fastest growth takes place in areas
corresponding to planning and decision making.
Bio-Chemistry Of Learning and Unlearning, PNAS
It is now a widely accepted fact that learning takes
place by changing synaptic connections between those neurons that
are simultaneously active during the process that is to be
learned. A common misunderstanding is that the change in the
connection strength is always positive in the sense of long-term
potentiation ("LTP"). But what is often forgotten is that synaptic
modification can also take place in the form of long-term
depression ("LTD"), i.e., neurons that were firing together in the
past are less likely to trigger mutual firing in the future.
Sometimes LTD is also referred to as "unlearning", a process that
has been shown in theoretical models to greatly enhance the memory
of a neural network, for instance by unlearning all structures that
are not relevant in the long term and therefore better forgotten.
It is a fascinating indicator of the complexity of brain
functioning that the same synapse of the same neuron can sometimes
"learn" (LTP) and sometimes "forget" (LTD) depending on the
presence or absence of neuromodulators. These are basically three
substances (noradrenaline, acetylcholine, serotonin) that act as
"enabling factors" that can speed up either LTP or LTD. These
neuromodulators are not simply "potentiators" or "depressors";
they can play both roles depending on the conditions: Kirkwood
reports that for neurons in the cat visual cortex, short
high-frequency stimulation causes LTP whereas prolonged low-frequency
stimulation leads to LTD. At least that is the simple
rule for two of the enabling factors. But for serotonin the story
is a little more complex: it seemed to facilitate both LTP and LTD
in an unpredictable fashion. A solution to this mystery was found
with the discovery that receptors for serotonin are not uniformly
distributed but come in patches. In some patches serotonin
facilitates LTP even at low stimulation frequencies, in other
patches it has the opposite effect. One could speculate whether
this implements some form of "firmware".
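The frequency rule described above can be caricatured by a BCM-style plasticity sketch: the weight change flips sign at a frequency threshold, and a neuromodulator "gate" enables plasticity at all. The threshold and gain values are invented, and the on/off gate is a crude stand-in for the enabling-factor idea, not a model of the actual biochemistry.

```python
# Toy frequency-dependent plasticity rule: high-frequency stimulation
# produces LTP (positive weight change), low-frequency stimulation LTD
# (negative change).  theta and gain are made-up illustrative numbers.

def synaptic_change(freq_hz, neuromodulator_on=True, theta=10.0, gain=0.001):
    """Weight change per stimulation episode; sign flips at threshold theta."""
    if not neuromodulator_on:
        return 0.0  # no enabling factor present, no plasticity at all
    return gain * freq_hz * (freq_hz - theta)

print(synaptic_change(100.0))  # brief high-frequency burst: LTP (positive)
print(synaptic_change(1.0))    # prolonged low-frequency input: LTD (negative)
print(synaptic_change(50.0, neuromodulator_on=False))  # gated off
```

The patchy serotonin result would correspond, in this caricature, to the threshold theta taking different values in different receptor patches.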
The new insights into the complexity of the bio-chemical
conditions under which learning/unlearning takes place might also
shed light on the question why we sleep: Only during the awake and
alert state are all three neuromodulators active and "(…) a
given pattern of input activity might weaken synapses when only
one neuromodulatory system is on, but strengthen them when the
three systems are active simultaneously". Thus it might be
possible that random events that we experience during the waking
hours will be recalled during the random activation of cell
assemblies during the dream (REM) state but this time they will be
"unlearned". One might speculate that this is the reason why we
don't remember our dreams except for the ones that we vividly
remember. It looks like there are still a few puzzles left to
figure out how our brain does what it does.
Salmon And Human Brain Aging, Science Daily
A University of Colorado at Boulder study of
landlocked salmon indicates they possess a genetically programmed
"aging clock" timed by reproduction, which may provide insight
into human aging and Alzheimer's disease.
Richard Jones, a professor emeritus in the environmental,
population and organismic biology department, said the study is
the first ever to identify deposits of a peptide known as beta
amyloid in the brains of an aging, wild vertebrate population
under natural conditions. Sticky deposits of beta amyloid, called
plaques, are considered one of the hallmarks of aging and
Alzheimer's disease in human brains.
Many neurologists believe that beta amyloid -- produced when
a brain protein known as amyloid precursor protein, or APP, is
chopped into pieces by enzymes -- causes brain neurons to
degenerate and die, said CU-Boulder doctoral student and lead
study author Tammy Maldonado. But other researchers now believe
beta-amyloid deposits in the brain are relatively harmless and may
even be beneficial.
The CU-Boulder biologists found that specific brain regions
of spawning salmon exhibit neurodegeneration and amyloid plaques
remarkably similar to those in humans. But other brain areas used
for migration and spawning tasks continued to function, despite
the presence of beta-amyloid plaques, Maldonado said.
Both salmon and humans exhibit remarkably similar aging
symptoms, including brain decay, cardiovascular disease, muscle
atrophy, skin lesions and the resorption of internal organs.
Laboratory studies of APP and beta-amyloid molecules obtained from
salmon brains and from a small piece of brain tissue from a human
who died with Alzheimer's disease showed the molecules "to be very
similar if not identical," said Maldonado.
In the study, young, castrated kokanee salmon were shown to
live to be 7 years to 9 years old, instead of dying at age 2 or 3
like normal, spawning kokanee salmon, suggesting that salmon have
an "aging alarm" timed to go off at reproduction. Massive surges
of a stress hormone known as cortisol occur in both reproducing
and sterile salmon just prior to the onset of the rapid aging
process and subsequent death, Norris said.
"Cortisol surges may help these fish metabolize sugar and
produce enough energy to locate their home streams and reproduce,
but the surges also eventually may trigger brain aging and death,"
said Maldonado. The researchers plan to inject juvenile salmon
with cortisol and several other hormones next fall to see if the
experiment causes amyloid plaques to form in the brains of the fish.
Lab experiments have shown that the high cortisol spikes that
occur in the blood of aging rats and humans are present in higher
quantities in Alzheimer's victims and are known to kill neurons in
certain areas of the brain.
"If we find that stress hormones cause amyloid plaques to
form in salmon brains, that would be quite a breakthrough," said
Do Flies Have To Go To Sleep, Too?, Science
There is a saying in physics that you don't understand
what you are doing unless you are able to explain it to a child.
It seems that finally science has a halfway plausible answer to
one of the 3-year-old classics: "Do flies have to go to sleep,
too?" Even the standard scientist's disclaimer, "That depends on
what you mean by 'sleep'," can be answered in a way that makes
sense: Sleep can be described as a behavioral state of rest with
an increased arousal threshold, in other words: if you sleep it is
harder to wake you up. The second criterion is that if you didn't
get enough sleep you have a desire to catch up on your sleep,
something that experts call "homeostatic control".
Shaw et al. applied these criteria in a number of careful
experiments to the fruit fly. The first challenge is to quantify
and measure the state of "rest" in flies. The researchers came up
with a combination of ultrasound and infrared devices that would
record the movements of the flies or the absence thereof during
states of rest.
Their experimental answer to the simple question about sleeping
flies is a simple "Yes, they do." When they sleep it takes about
a hundred times the effort (in the form of vibrations) to startle
them compared to their waking state. In the same way the
researchers could show that sleep-deprived flies were really tired
and needed more rest. An interesting side-result is that flies
seem to respond to drugs just like people: coffee at night keeps
them awake and sleeping pills put them to sleep.
Because of the surprising similarities in sleep behavior of
flies and mammals one might expect that sleep patterns are
genetically controlled. Indeed Shaw et al. could confirm that
several "waking" genes can be found in the fly that are similar to
those found in rats. They are predominantly expressed during the
first few hours of waking. These results make Drosophila a prime
candidate to become a realistic model to study the genetic
foundations of sleep functions.
It is unlikely that these findings will silence a curious
3-year-old for very long. The next question probably is: "But do
flies have dreams, too?"
Extreme Acid Loving Microbes, Science
In the past few years a number of reports were published
about life that has adapted to extreme conditions. The first
living cells are believed to have emerged under conditions without
sunlight in the deep ocean that would be considered extreme today.
Edwards et al. describe a new species of archaeon that
grows in hot places at acid concentrations that would burn human
tissue instantly. These truly extreme-loving microbes thrive at a
pH level of zero, the highest acid concentration ever reported to
support life. They are found deep in a mine shaft inside Iron
Mountain, California, at temperatures of 40 degrees Celsius. If
they are put in regular tap water they disintegrate. Nevertheless
they can be found all over the world.
How these microbes manage such a global distribution is still a
mystery (maybe in the acidic stomachs of animals?). It is also an
open puzzle how they manage to withstand these extreme acid
concentrations. The researchers expect that it must have to do
with special membrane characteristics, since the microbes found in
Iron Mountain do not possess any cell walls. Maybe these microbes
have found some tricks that might be interesting for other applications.
Unquestioned, however, is the contribution of these acid-loving
bacteria to the global environment: They play a major role in the
global cycling of iron and sulfur.
Microbe Thrives at pH 0, Elizabeth Pennisi, Science Vol. 287 (5459), 1731, 2000
An Archaeal Iron-Oxidizing Extreme Acidophile Important in Acid Mine Drainage,
Katrina J. Edwards, Philip L. Bond, Thomas M. Gihring, Jillian F. Banfield,
Science Vol. 287,
Pollution Stops Rain and Snow In Clouds, Science
Clouds form one of the sensitive mechanisms in the
control of weather and climate dynamics. Since they are associated
with phase transitions of water between all its three states (ice,
liquid, vapor) their formation and dissolution is associated with
large conversions of thermal energy. Since phase transitions are
critical phenomena, they can be triggered by changes in
microscopic factors. In the case of cloud formation the size of
Cloud Condensation Nuclei (CCN) plays that critical role. If CCNs
are absent, clouds will not form even if the dew point is reached
i.e., humidity is high enough and temperature is low enough that
vapor would condense into water droplets. If CCNs are too small,
the droplets will not grow big enough to cause precipitation in
the form of rain or snow.
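The two conditions in the paragraph above can be written as a toy decision rule for a single air parcel; the 0.1 micrometer size threshold is an invented number, purely for illustration of the logic, not a physical constant.

```python
# Toy classification of cloud/precipitation formation, distilled from the
# two conditions above: no CCNs -> no cloud even past the dew point; CCNs
# too small -> cloud whose droplets never grow large enough to fall.

def cloud_outcome(temp_c, dew_point_c, ccn_diameter_um=None):
    """Classify what happens in one air parcel (hypothetical thresholds)."""
    if temp_c > dew_point_c:
        return "no cloud: air not saturated"
    if ccn_diameter_um is None:
        return "no cloud: no condensation nuclei"
    if ccn_diameter_um < 0.1:  # invented 'too small' threshold
        return "cloud, but droplets too small for precipitation"
    return "cloud with rain or snow"

print(cloud_outcome(5.0, 10.0, 0.5))   # saturated air, large CCNs
print(cloud_outcome(5.0, 10.0, 0.01))  # smoke-sized CCNs suppress rain
```

The smoke-plume and ship-track observations discussed below correspond to the third branch: plenty of nuclei, but all of them too small.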
In general CCNs consist of aerosols of different origins. It
has been known, for instance from the satellite-based Tropical
Rainfall Measurement Mission (TRMM), that smoke from burning
vegetation produces plumes of small aerosols that are not very
efficient in forming droplets large enough to produce precipitation.
There has been a debate about the impacts of human activity on
cloud dynamics. Reports about enhanced rainfall over cities have
been associated with large CCNs but there are other factors that
can lead to enhanced rainfall such as the heat-island effect and
increased friction of urban areas. Observation of ship tracks in
marine stratocumulus clouds, however, showed that CCNs from ship
stacks redistribute water into small droplets and thus suppress
precipitation.
Rosenfeld could confirm these results with the help of
satellite data that reveal that plumes of reduced cloud particle
size and suppressed precipitation originate from major urban areas
and industrial facilities.
The impact of this observation on climate change can be
significant: It is known that water vapor is one of the most
potent greenhouse gases. Reduction of rainfall could therefore
lead to a significant heating of the atmosphere. Since this effect
is not uniform but concentrated around populated areas, one can
also expect other impacts for instance on the formation of storms.
It is hard to get used to seeing kids too young to rent
a car starting e-companies and making so much money that they can
buy private jets instead. Meyer gives a theoretical analysis of
this new world with a connected economy of increasing returns,
where rivals can "instantaneously clone competitors' killer
innovations" (unless they are patented or otherwise protected by
law). He explains that it is very difficult today to gain a
"sustainable competitive advantage" in the sense of making higher
than normal profits over an extended period of time.
In the good old days of diminishing returns the system had a
built-in stabilizing factor, or negative feedback, in the form of
natural economic limitations like the limited availability of
farmland etc. In the information industry those natural limits
don't exist, and it is no problem at all for a single company to
supply 100% of a certain software product for all computers in the
world. The same is true for movies or video games, especially if
they are not tied to substrates like celluloid or game consoles
(e.g. Sony's PS2): information in the form of software or data can
be reproduced at basically no marginal cost. This clearly creates
an incentive for mega-mergers and "winner-takes-all" markets.
In the context of non-linear dynamical systems this would be a
positive feedback situation with one globally attracting mode or
order parameter. If the fitness of the resulting system is reduced
it becomes unstable against slowly changing parameters due to
adaptation. Meyer mentions that subtle aspects of the rules of
baseball have been continually changed to keep the balance of the
game in such a way that it remains interesting for the audience.
(Being from Germany I have always wondered why baseball bats are
legal with which the balls can be hit outside of the playing field
where they might injure a spectator; it must be one of the thrills
of the game.)
Just in the same way Meyer argues the government slowly changes
the rules of the market place in updating antitrust regulations in
the interest of the consumers. The ruling against Microsoft and
the patenting of genetic information are just two examples of
stabilizing changes in the boundary parameters of an economy with
networked "eco-systems" of knowledge-based firms.
Iridium Global Satellite System Terminated, Wall Street Journal
The speed at which the Internet found global acceptance
has been seen as a sign of global self-organization and a
transition to a new phase for the global economy. It seemed to be
obvious that technological developments that support this trend
will be a good investment. The creation of a network of low earth
orbiting satellites seemed to be one of those examples that would
allow global access to information around the clock independent of
where you are. From a simple cell-phone-like handset one could
directly uplink to one of a couple of dozen satellites and be
connected to the global information system. The first set of
satellites was launched just a few years ago under the brand name
Iridium. Today it becomes obvious that a great idea for global
communication with cutting-edge technology is not necessarily
economically sustainable: the Iridium project has been declared
bankrupt. This is bad news for business and adventure travelers in
areas with poor telephone service (like Antarctica).
On the other hand, the project demonstrated the technical
feasibility of global communication that would let rural
communities in the third world leapfrog past the industrial
revolution and the migration into city slums and connect directly
to the information economy. It shows that it is technically
possible to use global satellite communication for the education
of women in third world countries, a factor that has been
identified as the single most important leverage factor to limit
over-population, hunger, and a number of other urgent third world
problems.
It is sad that this technical opportunity to make global
changes in the third world will be missed and, instead, 66
perfectly fine satellites are going to be "deorbited" to burn up
in the atmosphere. But maybe this evolutionary dead-end will make
room for even better and cheaper solutions: Motorola mentions
"High Frequency Single Sideband Radio Systems" as an alternative
to the Iridium system for long-range communications of thousands
of kilometers for voice, data, e-mail and fax. (http://www.motorola.com/satellite/info/)
The Sims And Agent Based Modeling, ComDig 2000.9.6
I read "The Sims And Agent Based Modeling" of the current
Digest with great interest. Glad to see that a more playful
viewpoint on ALife/complexity is also being accepted now. I wonder
if a tool such as AgentSheets would also be of interest to the
Complexity community? AgentSheets is not a substitute for a tool
like Swarm, but on the other hand it is end-user programmable and
has been used by thousands of non-programmers to build pretty
sophisticated simulations (e.g., http://www.apple.com/education/LTReview/fall99/agentsheets/index.html).
Here are some simulations built with it: http://www.agentsheets.com/showcase.html.
There are extensions available to build more complex applications
including a Sugarscape language kit.
Letters To The Editor:
A Computer To Outsmart A Raging Fire, ComDig 2000.9.7
Peter Jung, Physics Department, Ohio State University,
points out a sloppy use of the terms "meta-stable" and "excitable"
in ComDig 2000.9.7 (A Computer To Outsmart A Raging Fire, NY Times). He
writes: "A metastable system falls from its energetic higher state
into the globally stable state and will stay there (like the
buildings or the tanker). The activation energy can be small but
the gain can be large.
In contrast an excitable system will automatically travel back
to its original rest state after being excited."
That means the property of the system to restore its
meta-stable state is part of the definition of an "excitable
system". For instance a forest could be considered excitable, but
isolated buildings are only meta-stable. On the other hand a city
could be considered an "excitable system" if burned-down buildings
are eventually replaced by new ones.