Spooky Action At a Distance and Superluminal Transmission

The recent loophole-free confirmation of violations of John Bell's inequalities means that the question of instantaneous information transmission is a live issue, with serious practical and philosophical ramifications. For example, Nick Bostrom's simulation argument may be reinforced by the idea that the substrate of the physical world is configured such that a physical variable at one point is associated with a physical variable at another point that we would empirically detect as distant, with no detectable finite causal interaction between them and with apparently instantaneous synchronisation. This is just the kind of thing one can bring about in a 3D computer game: two objects can be synchronised by the underlying code in a way that would be unnatural in the physical world.
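To make the game analogy concrete, here is a minimal Python sketch (all names are my own illustration, nothing to do with any real game engine) of two rendered objects kept in lockstep by a shared variable in the underlying code:

```python
# Minimal sketch: two "distant" in-game objects kept in lockstep by a
# shared variable in the underlying code. All names are illustrative.

class SharedState:
    """The hidden substrate: one variable backing two rendered objects."""
    def __init__(self, value):
        self.value = value

class GameObject:
    def __init__(self, name, substrate):
        self.name = name
        self.substrate = substrate  # both objects reference the same state

    @property
    def spin(self):
        return self.substrate.value

    def flip(self):
        # Flipping one object "instantly" flips the other, because the
        # synchronisation lives in the code, not in the game world.
        self.substrate.value = -self.substrate.value

substrate = SharedState(+1)
near_object = GameObject("object_at_origin", substrate)
far_object = GameObject("object_far_away", substrate)

near_object.flip()
print(far_object.spin)  # -1: changed with no in-world signal or travel time
```

Inside the game world, no signal passes between the two objects; the correlation is enforced by the substrate the game runs on.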

The popularisation of string theory's shortcomings (less pronounced since Ed Witten introduced M-theory in 1995) and the work of science popularisers like Lawrence Krauss, Peter Woit, John Gribbin, and Neil deGrasse Tyson have coincided with mounting interest in metaphysics and the philosophy of physics among physicists and philosophers alike. The loophole-free confirmation of violations of Bell's theorem, however, may well prove more significant than Hawking's prediction of black hole radiation.

Spooky Action at a Distance
When Einstein produced his theory of special relativity and his work on quantum mechanics, he made a disturbing discovery, which he called "spooky action at a distance". Spooky action at a distance is better known among physicists as non-locality, or quantum entanglement.

What is entanglement, and what does it have to do with information transmission? Information transmission depends upon cause and effect. If cause and effect can operate instantaneously, then arguably information transmission can too.

Briefly, quantum theory predicts, as Einstein and his collaborators pointed out in 1935, that two quantum systems (very small systems) like photons, electrons, or other particles will affect each other's states instantaneously even if they are separated by large distances. The systems (particles) start out together at the same place and time, and can then become widely separated. Once they are separated by a significant distance, a measurement of one that fixes its state will necessarily result in an instantaneous fixing of the state of the other in the opposite direction (quantum particles have an intrinsic angular momentum called spin, and a measurement of spin along a chosen axis yields a direction, usually labelled up or down). If you measure one particle and find it spin up, the other will be found spin down, even if it has traveled a long way off. Instantaneously, with no delay at all. Spooky, said Einstein.

Now, instantaneously here literally means instantaneously. Not at the speed of light, or near it, but with no transit time at all. In fact, speed or velocity is not even the right thing to talk about: the cause-effect relation of entanglement, or of non-local effects, is immediate. It would be as if a pitcher threw a baseball while the batter hit exactly the same ball at exactly the same time, with the pitcher on Earth and the batter on the Moon.

There is, theoretically and practically, no speed involved: just an instantaneous change in the state of one physical system based upon a change in the state of the other (usually when that other system is physically measured). It is almost as if the fan's watching the pitcher is what makes the action at a distance happen. If the pitcher is pitching, you know the batter is batting at that very moment, with the same ball. If the fan goes and hands the pitcher a bat, there is an extremely high likelihood (a virtual certainty) that the distant batter will have become a pitcher at that exact same moment. It is THAT weird.
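For readers who like to tinker, here is a toy simulation of the statistics just described. It reproduces only the outcome correlations, not the underlying quantum mechanics: each result is individually random, but the pair always disagrees:

```python
import random

# Toy simulation of the anti-correlated outcomes described above, for a
# pair measured along the same axis. This mimics the statistics only; it
# does not model the quantum state itself.

def measure_entangled_pair():
    alice = random.choice(["up", "down"])     # Alice's outcome is random...
    bob = "down" if alice == "up" else "up"   # ...Bob's is always opposite
    return alice, bob

for _ in range(5):
    alice, bob = measure_entangled_pair()
    assert alice != bob                       # opposite every single time
    print(alice, bob)
```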

Einstein did not like spooky action at a distance at all, because it suggested that the usual understanding of cause and effect and causal chains in physics was largely wrong for quantum mechanics. Either something was wrong with the mathematics, said Einstein, or something very weird was going on in the universe. He came to the conclusion that there must be intermediate causal structures between the entangled quantum systems that had not yet been physically detected. This position is called local hidden variable theory, or local realism. The realism means that there is really something there doing the entangling, and the local means that spooky action at a distance simply does not happen: instead there is a hidden intermediate causal structure local to the quantum systems.

A Speed for Entanglement After All


Now, many readers will be aware of Einstein's maxim that no body with nonzero rest mass (the mass a body has when it is not moving relative to a given frame of reference) can reach or exceed the speed of light. That's an immutable law of the universe, right? Well, maybe. Entanglement looks like genuine action at a distance: Einstein's hidden local variables have not been found, and in fact the theory has turned out to be unsupported by empirical experimental findings.

Recently, some Chinese physicists have tried to measure the speed of non-local effects (refer to the list of references at Physics News). I have just said that there is no speed involved, so what is this experiment about? Well, their findings don't provide much comfort. What they established experimentally (assuming no errors are discovered later) is that if there is a speed of entanglement, then it has to be at least 10,000 times the speed of light. They do not know whether limitations of their equipment are what set this value; the speed might be even higher, or there might be no speed at all, as suggested above.
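To put the bound in perspective using the Earth-Moon baseball picture from above, a little arithmetic (my own illustration, not from the experiment) shows what 10,000 times the speed of light would mean:

```python
# Back-of-envelope arithmetic on the reported lower bound, using the
# Earth-Moon picture from earlier (distance rounded to the mean value).
c = 299_792_458             # speed of light, m/s
earth_moon = 384_400_000    # mean Earth-Moon distance, m

light_time = earth_moon / c
bound_time = earth_moon / (10_000 * c)
print(f"light: {light_time:.2f} s")              # ~1.28 s
print(f"at 10,000c: {bound_time * 1e3:.3f} ms")  # ~0.128 ms
```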

Bell's Theorem: Is Spooky Action at a Distance Real, or Are There Hidden Local Intermediate Causal Structures?

The theoretical physicist John Bell made things even worse for Einstein in 1964, with a theorem showing that the mathematical predictions of quantum mechanics cannot be reproduced by any theory of hidden local variables of the kind he himself had formalised.

Things got worse still for hidden local variable theory when Bell's predictions were supported by experiment in 1972 by John Clauser and Stuart Freedman. Alain Aspect did it again with experiments in 1981.
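The cleanest way to see what these experiments test is the CHSH form of Bell's inequality: any local hidden variable theory must satisfy |S| ≤ 2, while quantum mechanics predicts up to 2√2. The sketch below uses standard textbook measurement angles (my illustration, not taken from the papers above):

```python
import math

# CHSH sketch: E(a, b) = -cos(a - b) is the quantum correlation for a
# singlet pair measured along directions a and b (angles in radians).
# Any local hidden variable theory must keep |S| <= 2.

def E(a, b):
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2              # Alice's two measurement settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))       # ~2.828, i.e. 2*sqrt(2)
assert abs(S) > 2   # violates the classical (local hidden variable) bound
```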

What Now – Superluminal Information Transfer?

In the best mathematical and scientific theories of information, information transmission involves loss due to signal noise and is limited by the transmission rates permitted by the transmission medium. Some information theorists assert that information transfer is only about the covariance (the simultaneous changing) of one structure with another, in such a way that the state of one system (an information receiver) tells one something about the state of the other (the information source) with a certain degree of probability.

Now, entanglement is not normally regarded by physicists as an information channel in the statistical sense, since there is no uncertainty about the state of one system if the state of the other is known. Formal statistical measures of information require statistical uncertainty, because according to those measures information just is a reduction in uncertainty, or an increase in probability, about the next state of the source based on the current state, or else based upon received signals caused by the current state. (See Robert M. Gray's text Entropy and Information Theory: http://ee.stanford.edu/~gray/it.pdf; see also Warren Weaver's introduction to Claude E. Shannon's The Mathematical Theory of Communication.)
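To make the statistical point concrete, here is a short illustration of my own: for perfectly anti-correlated outcomes, knowing one side removes all uncertainty about the other, so the mutual information between the two sides is a full bit:

```python
import math
import random
from collections import Counter

# Illustration: perfectly anti-correlated outcomes carry a full bit of
# mutual information, because knowing one side removes all uncertainty
# about the other.

def entropy(counts):
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

pairs = []
for _ in range(100_000):
    a = random.choice(["up", "down"])
    pairs.append((a, "down" if a == "up" else "up"))

H_a = entropy(Counter(a for a, _ in pairs))   # ~1 bit
H_b = entropy(Counter(b for _, b in pairs))   # ~1 bit
H_ab = entropy(Counter(pairs))                # ~1 bit (only two joint states)
print(H_a + H_b - H_ab)                       # mutual information: ~1.0 bit
```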

I will put aside this statistical conception of information as an impediment to quantum entanglement channels for information transmission, because the statistical conception is only one (albeit very important) element of the transmission of information, and only one conception of information transfer.

Whither the Second Law of Thermodynamics: Entropy defeated?

The point is that if entanglement is a real physical causal relation, if it really involves some kind of instantaneous physical cause-effect interaction, then information can be transferred instantaneously. If that is true, then many things become unclear. Because of entropy and the second law of thermodynamics, causality is limited in a causally closed universe: energy loss and impedance limit transmission speeds in predictable ways.

However, if non-local quantum information transmission is real, then it looks very much like we might be able to send information without any signal loss at all. Even weirder, information might be transmissible with no intermediate causal pathway or structure at all. That is spooky indeed.

References:

J. S. Bell (1966), "On the Problem of Hidden Variables in Quantum Mechanics," Rev. Mod. Phys. 38, 447.

Are There Informational Laws in Genome Evolution?

Republished from Ontic Cafe, 2014

This post is the first of two. The material comes from a quite complex area at the intersection of the philosophy of biology and information theory in biology. Non-academic philosophers should nevertheless be able to get a reasonable idea of what is going on, and to benefit from an introduction to one of the hottest topics in the philosophy of information and biology.

Different Conceptions of Information

Let's start with a rapid introduction to the philosophy of information. There are several conceptions of the nature of information, that is, of what information actually is. These conceptions vary dramatically in their details and in their ontological commitments: the things that must exist for there to be any information (in philosophical language, the "necessary conditions" for information to exist). Here, a couple of quick examples will be instructive.
The most common understanding, and the most common scientific one, is that of quantitative information theories. In these theories, information is characterised on a statistical or probabilistic basis. According to these conceptions, information exists when there is a reduction in uncertainty about what is happening at an information source. An information source is any physical process that can be modeled statistically: one about which you can say that there is a certain probability of the next state of the source given the current one. A simple example is your reading of this sentence. Each word makes the next word more or less likely, because of the structure of the English language and its rules (grammar and meaning) for making English sentences. The source is the text you are reading. This is the very example most used by the founder of modern quantitative information theory, the mathematician Claude E. Shannon (The Mathematical Theory of Communication, 1948).
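Shannon's reading example can be made concrete in a few lines. This sketch (with a deliberately tiny, made-up corpus) measures how much uncertainty remains about the next word given the current one:

```python
import math
from collections import Counter, defaultdict

# Shannon's reading example in miniature: treat a text as a source and
# ask how uncertain the next word is, given the current one. The corpus
# here is tiny and made up purely for illustration.

text = ("the cat sat on the mat the cat ate the rat "
        "the rat sat on the cat").split()

following = defaultdict(Counter)
for current, nxt in zip(text, text[1:]):
    following[current][nxt] += 1

def entropy(counts):
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

for word, counts in sorted(following.items()):
    print(f"after '{word}': {entropy(counts):.2f} bits of uncertainty")
# 'the' can be followed by several words (high uncertainty); words like
# 'on' fix the next word completely (zero bits left to learn).
```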
The main alternatives to quantitative statistical theories are algorithmic theories. These involve measuring the complexity of strings of symbols, or what are called data objects. Any sequence of elements can be a data object, and the longer and more complex the data object, the more information it has. The most famous such theory was developed by the Russian materialist mathematician Andrei Kolmogorov. In Kolmogorov's theory, the amount of information in a data object is given by the length of the shortest program or description required to generate or construct that data object.
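Kolmogorov complexity itself is uncomputable, but compressed length is a standard practical stand-in, as this sketch shows: a repetitive data object needs only a short description, while a patternless one does not compress:

```python
import os
import zlib

# Kolmogorov complexity is uncomputable, but compressed length is a
# common practical proxy: a regular data object has a short description,
# while a patternless one does not compress.

regular = b"ab" * 500            # 1000 bytes of pure repetition
patternless = os.urandom(1000)   # 1000 bytes with no exploitable structure

print(len(zlib.compress(regular)))      # tens of bytes: short "program"
print(len(zlib.compress(patternless)))  # ~1000 bytes: nearly incompressible
```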
Semantic Information

Quantitative, statistical-measure-based conceptions and definitions of information have often been seen as inadequate because, as Claude Shannon himself wrote in The Mathematical Theory of Communication, they do not attempt to capture any meaning of the symbols that are transmitted. His predecessor R. V. L. Hartley wrote that "[i]t is desirable therefore to eliminate the psychological factors involved and to establish a measure of information in terms of purely physical quantities" (Transmission of Information, 1928, 536).
Shannon's peer and mentor Warren Weaver first observed that it would in future be desirable to formulate a conception of information that accounted for meaning. Later theorists came to refer to such conceptions as theories of semantic information. There have been several of these, mostly naturalistic, offered by both mathematicians and philosophers. The first notable attempt was by the famous Vienna Circle philosopher and logician Rudolf Carnap, who joined with the mathematician Yehoshua Bar-Hillel to formulate a theory of semantic information in which the semantic information content of a sentence is determined according to a logical formulation (1953). In lay terms, the information content of a sentence is the set of all possible states of affairs that are ruled out if that sentence is true.
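A toy version of the idea (a drastic simplification of the 1953 formalism, using just two atomic sentences) can be spelled out in code: the more possible states a sentence rules out, the more informative it is:

```python
from itertools import product

# Toy Bar-Hillel/Carnap-style content measure over two atomic
# sentences p and q: the "content" of a sentence is the set of possible
# states it rules out. A drastic simplification of the 1953 formalism.

worlds = list(product([True, False], repeat=2))   # all four (p, q) states

def content(sentence):
    """Return the worlds excluded by a sentence (a predicate on p, q)."""
    return [w for w in worlds if not sentence(*w)]

tautology = lambda p, q: p or not p
conjunction = lambda p, q: p and q

print(len(content(tautology)))    # 0: excludes nothing, so no information
print(len(content(conjunction)))  # 3: rules out three of four states,
                                  # so it is highly informative
```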
Later, various other conceptions of semantic information were developed. The philosopher Fred Dretske adapted elements of Shannon's theory (Knowledge and the Flow of Information, 1981). The mathematician Keith Devlin produced another logical conception (Logic and Information, 1991). More recently, Luciano Floridi has produced a theory of semantic information that extends and adapts ideas put forward by Devlin, Bar-Hillel, and Carnap. It differs in requiring information to have alethic value: to be based upon data which are truthful according to certain fairly complex criteria (Floridi, "Information," in The Blackwell Guide to the Philosophy of Computing and Information, 2004; Information: A Very Short Introduction, 2010; The Philosophy of Information, 2011).
The idea of semantic theories of information is that information and meaning are directly related somehow. Usually meaning is thought to involve truth value of some kind.
Meanwhile in Physics and Biology

An enormous part of the story of our understanding of the nature of information comes from physics. I will not say much about that here, except that physicists often regard information as a physical thing. Another pioneer of information theory, the father of cybernetics Norbert Wiener, once wrote that "information is information, not matter or energy. No materialism which does not admit this can survive at the present day" (Cybernetics: or Control and Communication in the Animal and the Machine, 1948). No physicist has claimed that information is matter or energy, but the quantum computing pioneer Rolf Landauer was sure that it is physical (Information is a Physical Entity, 1996).

An enormous amount of philosophical and technical thought about information also comes from biology. This is not so surprising, given the importance of the concept of information to genetics and DNA science. Traits inherited by one generation of phenotypes (organisms) from the previous one are described in terms of information. So is what is referred to as the central dogma of molecular biology: that information cannot flow from the phenotype (the developed body) back to the genotype (the genes/DNA). In other words, if I cut my hand, no child I conceive in the future will bear the same cut. More recently the central dogma has come under challenge from the field of epigenetics, in which things in addition to the gene (the DNA itself) are thought to contribute heritable information: information that is passed from one generation to the next. These can include processes within the cytoplasm of the cell, or even features of the organism's environment, such as the structure of the nests in which young are reared. Still, it is often information transmission that is of interest.
At least since Crick and Watson's discovery of the double helix structure of DNA in 1953, biologists and philosophers of biology have been contemplating and arguing about the nature of information and information transfer in DNA and biosynthetic processes. Biosynthetic processes are processes in which smaller molecules are combined to form more complex molecules with more complex functions (processes such as the manufacture of proteins and other biological structures from genetic material). Such processes are frequently described in terms of information.
Codes, encoding, transmission, and even information compression have been discussed as real features of the processes involving genetic material.
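As a concrete, heavily simplified illustration of this coding talk, here is a toy sketch using a handful of real codons from the standard genetic code (real biology involves promoters, splicing, regulation, and much else this omits):

```python
# Toy model of the coding picture: DNA coding strand -> mRNA -> amino
# acids, using a small real subset of the standard genetic code.

CODON_TABLE = {  # a handful of real codons
    "AUG": "Met", "UUU": "Phe", "UUC": "Phe",
    "GGC": "Gly", "GAA": "Glu", "UAA": "STOP",
}

def transcribe(dna_coding_strand):
    # Coding-strand convention: the mRNA matches it, with T replaced by U.
    return dna_coding_strand.replace("T", "U")

def translate(mrna):
    protein = []
    for i in range(0, len(mrna) - 2, 3):       # read in triplets (codons)
        amino = CODON_TABLE.get(mrna[i:i + 3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

dna = "ATGTTTGGCGAATAA"
print(translate(transcribe(dna)))  # ['Met', 'Phe', 'Gly', 'Glu']
```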
This all raises a question, however. We saw in the previous section that there are many conceptions of information. So which is the right one for biology? Molecular bioscientists and philosophers of biology are still trying to figure that out. There are even arguments about whether genetic information is semantic or not: whether it has meaning, and if so in what way (see recent work by Nicholas Shea on what he calls infotel semantics, on which the meaning of genetic information is determined by its function). Some philosophers of biology even hold an eliminative conception of information in biology: they eliminate it from the discussion, completely or partly, as a useless metaphor that confuses matters and does not explain anything real (see Paul E. Griffiths, Genetic Information: A Metaphor in Search of a Theory, http://philsci-archive.pitt.edu/89/1/Genetic_Information_etc.pdf).
Are There Informational Laws in Genome Evolution and the Evolution of Protein Synthesis?

This entire area of the nature of information in molecular bioscience is complex and keenly debated. In this two-part series, however, I am interested in a very specific part of the debate, one that is perhaps the most exciting and the most relevant to philosophy in general, and to no small amount of evolutionary science today. It involves the question of how protein synthesis evolved by natural selection. Protein synthesis is an incredibly complex biosynthetic process that has only recently come to be well understood. The complexity of protein folding and gene splicing meant that the details of these processes were wholly mysterious until recently. How such processes came to evolve naturally to their current state is an even more challenging mystery.
Above is an artist's representation of the process of protein synthesis from DNA, via the processes of DNA transcription and translation, into a chain of amino acids and finally into a folded protein. The process is staggeringly complex, with only the most basic fundamental steps represented here. Molecular bioscientists usually take it for granted that information is transmitted from the DNA to the protein. A much larger question, however, is how the information of the entire process, and of the structures involved in it, came to be as it is through evolutionary processes. Eugene V. Koonin has proposed that "[a]lthough a complete physical theory of evolutionary biology is inconceivable, the universals of genome evolution might qualify as 'laws of evolutionary genomics' in the same sense 'law' is understood in modern physics" (http://www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1002173). The details of this proposal involve the laws being expressed largely in statistical and informational terms.
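One of the statistical regularities Koonin discusses is that the sizes of paralogous gene families follow a roughly power-law distribution. As a purely illustrative sketch (synthetic data, not real genomes), here is how such an exponent can be generated and then recovered from the size distribution:

```python
import math
import random
from collections import Counter

# Purely illustrative: draw synthetic "gene family sizes" from a power
# law with a known exponent, then recover the exponent from the size
# distribution with a simple log-log fit. Synthetic data, not genomes.

random.seed(1)
alpha = 2.0   # assumed exponent for the synthetic data
sizes = [int(random.random() ** (-1 / (alpha - 1))) for _ in range(50_000)]

counts = Counter(sizes)
points = [(math.log(s), math.log(n))
          for s, n in counts.items() if 1 <= s <= 100]

n = len(points)
mean_x = sum(x for x, _ in points) / n
mean_y = sum(y for _, y in points) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in points)
         / sum((x - mean_x) ** 2 for x, _ in points))
print(f"fitted exponent: {-slope:.2f}")   # close to the assumed 2.0
```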