This page evaluates the work of Werner Gitt in light of the Ev program.
Yes, paradoxical though it may sound, considered from the point of view of information theory, a random sequence of letters possesses the maximum information content, whereas a text of equal length, although linguistically meaningful, is assigned a lower value.

Here Gitt falls into a standard misunderstanding (see Information Is Not Entropy, Information Is Not Uncertainty!). That is, he forgets to subtract: information is the decrease in uncertainty, R = H_before - H_after, not the uncertainty H_before itself.
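The subtraction is easy to demonstrate. A minimal sketch (assuming, for simplicity, 26 equiprobable letters and an error-free reception):

```python
import math

def entropy(probs):
    """Shannon uncertainty H, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before receiving a symbol: 26 equally likely letters.
h_before = entropy([1 / 26] * 26)   # about 4.70 bits of uncertainty

# After an error-free reception the symbol is known exactly.
h_after = entropy([1.0])            # 0 bits of uncertainty remain

# Information gained is the *decrease* in uncertainty,
# not the initial uncertainty H_before by itself.
r = h_before - h_after
print(round(r, 2))  # 4.7
```

A random sequence has a large H, but H alone is uncertainty; only the difference between the state before and after measures information received.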
Theorem 1: The statistical information content of a chain of symbols is a quantitative concept. It is given in bits (binary digits).

This is a definition, not a theorem. Gitt does not prove it.
Theorem 2: According to Shannon's theory, a disturbed signal generally contains more information than an undisturbed signal, because, in comparison with the undisturbed transmission, it originates from a larger quantity of possible alternatives.

This is incorrect. Gitt has fallen fully into the pitfall described above and is stuck: he has confused H_before with information. From here on he is doomed. In this case he directly contradicts Shannon's own theorem and writings. Shannon used the fact that a disturbance decreases the information to prove his theorem; see Shannon (1948), Part II ("The Discrete Channel with Noise"), Section 11 ("Representation of a Noisy Discrete Channel"), page 21, Figure 8.
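Shannon's point can be illustrated numerically. As a sketch (assuming, for illustration, a binary symmetric channel with equiprobable inputs, rather than Shannon's exact Figure 8 example), the information transmitted is I = H(Y) - H(Y|X) = 1 - h2(p), which shrinks as the disturbance grows:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_information(error_rate):
    """Information transmitted per symbol through a binary symmetric
    channel with equiprobable input: I = 1 - h2(p)."""
    return 1.0 - h2(error_rate)

clean = bsc_information(0.0)   # 1.0 bit per symbol, no noise
noisy = bsc_information(0.1)   # about 0.53 bits per symbol

# A disturbed signal carries LESS information, not more.
assert noisy < clean
```

At an error rate of 0.5 the output is independent of the input and the transmitted information drops to zero, the opposite of what Gitt's Theorem 2 predicts.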
Theorem 3: Since Shannon's definition of information relates exclusively to the statistical relationship of chains of symbols and completely ignores their semantic aspect, this concept of information is wholly unsuitable for the evaluation of chains of symbols conveying a meaning.

This is a claim, not a theorem. It is based on 'meaning', an undefined term.
Theorem 5: The assignment of the symbol set is based on convention and constitutes a mental process.

This claim is disproven by the code generated by the Ev program.
Theorem 4: A code is an absolutely necessary condition for the representation of information.

and

Theorem 8: Only those structures that are based on a code can represent information (because of Theorem 4). This is a necessary, but still inadequate, condition for the existence of information.

are two "theorems" that depend on each other! It is a nice case of circular (non)reasoning.
If, for example, a basic code is found in any system, it can be concluded that the system originates from a mental concept.

Presumably, 'mental concepts' (another ill-defined term) must come from humans or other intelligences. This claim is disproven by the Ev program: the program starts with no code and evolves one independently of humans. So this statement is incorrect.
Meanings always represent mental concepts; we can therefore further state:

Theorem 10: Each item of information needs, if it is traced back to the beginning of the transmission chain, a mental source (transmitter).

Theorems 9 and 10 basically link information to a transmitter (intelligent information source). While many people have attempted to define 'meaning', none have done so mathematically. "Theorem 10" is shown to be incorrect since Ev generates information from mutation, replication, and selection. The 'meaning' in Ev (and in biology generally) is, if anything, functional: the patterns work such that the organism survives. No 'mind' is involved, so the conclusion that an intelligent information source is required is false. The environment provides selection and hence is effectively the source of the information (in the loose English sense).
SC2: A sequence of symbols does not represent information if it is based on randomness.

This 'sufficient condition' contradicts his 'Theorem' 2! (Recall that Theorem 2 claims that information is randomness.) Boy, is he confused!
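The contradiction is easy to exhibit numerically. In the following sketch (using per-symbol letter frequencies as a crude estimate of Shannon uncertainty, with illustrative sample strings), a random string has a higher H than meaningful text. That H is exactly the quantity Gitt's Theorem 2 treats as maximal "information", yet SC2 declares the same random string to carry none:

```python
import math
import random
from collections import Counter

def per_symbol_entropy(text):
    """Estimate Shannon uncertainty H (bits/symbol) from symbol frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
alphabet = "abcdefghijklmnopqrstuvwxyz"
random_text = "".join(random.choice(alphabet) for _ in range(1000))
english_text = "the quick brown fox jumps over the lazy dog " * 25

h_random = per_symbol_entropy(random_text)
h_english = per_symbol_entropy(english_text)

# Random sequences maximize the uncertainty H; by Gitt's Theorem 2
# that makes them maximally informative, while his SC2 says the
# opposite. Both cannot hold at once.
assert h_random > h_english
```

The resolution, again, is that H measures uncertainty, not information; neither of Gitt's two incompatible readings of it is Shannon's.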
6. No information chain can exist without a mental origin.

This is demonstrated to be incorrect by the Ev program.
Schneider Lab
origin: 2005 May 05
updated: 2011 Aug 24