Terrence Deacon

Terrence Deacon is professor of Biological Anthropology, Neuroscience, and Linguistics at the University of California, Berkeley.

In his 1997 book, The Symbolic Species: The Co-evolution of Language and the Brain, he argued that language and the brain coevolved by natural selection, although he now argues that the major source of language acquisition is social transmission, with a trial-and-error process analogous to natural selection occurring while the brain develops.

Deacon's 2011 work Incomplete Nature has a strong triadic structure, perhaps inspired by semiotics and the philosopher Charles Sanders Peirce's triad of icon, index, and symbol. Deacon's triad levels represent the material, the ideal, and the pragmatic. The first two levels reflect the ancient philosophical dualism of materialism and idealism, or body and mind, respectively. The major transition from the nonliving to the living - the problem of abiogenesis, and the introduction of telos into the universe - happens at Deacon's third level.

Teleodynamics is Deacon's name for the third level in his dynamics hierarchy. It is built on and incorporates the two lower levels — the first level is physical and material, the second adds an informational and immaterial aspect.

At the bottom level is the natural world, which Deacon characterizes by its subjection to the second law of thermodynamics. When entropy (the Boltzmann kind) reaches its maximum, the equilibrium condition is pure formless disorder. Although there is matter in motion, it is the motion we call heat, and nothing interesting is happening. Because equilibrium contains no meaningful differences, Deacon calls this the homeodynamic level, using the root homeo-, meaning "the same."

At the second level, form (the appearance of differences) emerges. Deacon identifies a number of processes that are negentropic, reducing entropy locally by doing work against the first level's thermodynamic tendency toward disorder. This requires constraints, says Deacon, like the piston in a heat engine that confines the expansion of a hot gas to a single direction, allowing formless heat to produce directed motion.

Atomic constraints such as the quantum-mechanical bonding of water molecules allow snow crystals to self-organize into spectacular forms, producing order from disorder. Deacon dubs this second level morphodynamic. He sees the emerging forms as differences against the background of unformed sameness. His morphodynamic examples include, besides crystals, whirlpools, Bénard convection cells, basalt columns, and soil polygons, all of which apparently violate the first-level tendency toward equilibrium and disorder in the universe. These are processes that information philosophy calls ergodic.

Herbert Feigl and Charles Sanders Peirce may have been the origin of Bateson's famous idea of a "difference that makes a difference."
On Deacon's third level, "a difference that makes a difference" (cf. Gregory Bateson and Donald MacKay) emerges as a purposeful process we can identify as protolife. The quantum physicist Erwin Schrödinger saw the secret of life in an aperiodic crystal, and this is the basis for Deacon's third level. He ponders the role of ATP (adenosine triphosphate) monomers in energy transfer and their role in polymers like RNA and DNA, where the nucleotide arrangements can store information about constraints. He asks whether the order of nucleotides might create adjacent sites that enhance the closeness of certain molecules and thus increase their rate of interaction. This would constitute information in an organism that makes a difference in the external environment, an autocatalytic capability to recruit needed resources. Such a capability might have been a precursor to the genetic code.

Deacon crafts an ingenious model for a minimal "autogenic" system that has a teleonomic (purposeful) character, with properties that might be discovered some day to have existed in forms of protolife. His simplest "autogen" combines an autocatalytic capability with a self-assembly property like that in lipid membranes, which could act to conserve the catalyzing resources inside a protocell.

Autocatalysis and self-assembly are his examples of morphodynamic processes that combine to produce his third level, teleodynamics. Note that Deacon's simplest autogen need not replicate immediately. Like the near-life of a virus, it lacks a metabolic cycle and does not maintain its "species" through regular reproduction. But insofar as it stores information, it has a primitive ability to break into parts that could later produce similar wholes in the right environment. And the teleonomic information might suffer accidental changes that produce a kind of natural selection.

Deacon introduces a second triad he calls Shannon-Boltzmann-Darwin (Claude, Ludwig, and Charles). He describes it on his Web site www.teleodynamics.com (sadly no longer active). I would rearrange the first two stages to match his homeodynamic-morphodynamic-teleodynamic triad. This would put Boltzmann first (matter and energy in motion, but both conserved, merely transformed by morphodynamics). A second Shannon stage then adds information (Deacon sees clearly that information is neither matter nor energy); for example, knowledge in an organism's "mind" about the external constraints that its actions can influence.

This stored information about constraints enables the proto-organism in the third stage to act in the world as an agent that can do useful work, that can evaluate its options, and that can be pragmatic (more shades of Peirce) and normative. Thus Deacon's model introduces value into the universe: good and bad (from the organism's perspective). It also achieves his goal of explaining the emergence of perhaps the most significant aspect of the mind: that it is normative and has goals. This is the ancient telos or purpose.

Appreciating Deacon's argument is easier with a little history. Claude Shannon's information theory produced an expression for the potential information that can be carried in a communication channel. It is the mathematical negative of Boltzmann's formula for entropy.

S = k log W
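
For comparison, Shannon's expression in its standard textbook form is

H = − Σ pᵢ log₂ pᵢ          (summed over the probabilities pᵢ of the possible messages)

which equals log₂ W when all W alternatives are equally probable.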

Confusingly, John von Neumann suggested that Shannon use the word entropy for his measure of information. Then Leon Brillouin coined the term negentropy to describe far-from-equilibrium conditions in the world epitomized by information. Since Erwin Schrödinger, we have known that life is impossible without the negative-entropy flow of far-from-equilibrium available energy from the sun.

Shannon entropy (which is negentropy) describes the large number of possible messages that could be encoded in a string of characters. Shannon's actual information reduces the uncertainty in the entropy of potential messages. Deacon notes correctly that new information can be transmitted only if these alternative possibilities exist. Without probability (ontological chance) and true alternative possibilities, there would be no information in the message.

"No possibilities = no uncertainty = no information," Deacon says. Without something new, the amount of information in the universe would be fixed. This is deeply true.

Organisms are not machines, and minds are not computers, says Deacon, criticizing cognitive scientists who seek a one-to-one correspondence between conscious thoughts or actions and neuronal events. Machines are assembled from parts, whereas organisms self-assemble, he insightfully observes.

Computers are designed to be totally predictable logical devices that are noise-free, but organisms and the mind could not survive if they worked that way, because the universe continually generates unpredictable new situations. The mind supervenes on astronomical numbers of neuronal events, which likely transmit far more stochastic noise than they do meaningful signals. Deacon thinks that meaningful mental events are probably only statistical regularities, averages over neuronal events, just as macroscopic classical properties are averages over quantum-level events.
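
A toy statistical sketch (a generic law-of-large-numbers illustration, not a neural model) shows why an average over many noisy events can be regular even when each individual event is not:

    import random

    random.seed(0)

    def noisy_events(n, signal=1.0, noise=5.0):
        """n independent events: a weak common signal buried in much larger noise."""
        return [signal + random.gauss(0, noise) for _ in range(n)]

    one_event = noisy_events(1)[0]
    average = sum(noisy_events(100_000)) / 100_000

    print(round(one_event, 2))   # dominated by noise; says little about the signal
    print(round(average, 2))     # close to 1.0; the statistical regularity emerges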

Deacon's interest in the etymology of words is fascinating, but his love of symbols leads him to use neologisms that make his sentences too dense, often obscuring his excellent ideas. He does provide a glossary of his newly coined terms, but these are difficult to keep in mind while reading his text.

For example, Deacon uses homeodynamic for his first level instead of the standard term thermodynamic, which he does use occasionally and which would have been clearer. Then, instead of morphodynamic for the second level, where information structures appear, he might have used negentropic (implying Shannon entropy and information creation). For his third level, teleodynamic is fine, but I'd have chosen the well-known term teleonomic, suggested by Colin Pittendrigh and used by Ernst Mayr and Jacques Monod, whose Nobel co-laureate François Jacob said that "the goal of every cell is to become two cells."
Deacon's triadic levels (compare Peirce and Hegel)
  • Homeodynamic - a dynamic process in which a system is approaching thermodynamic equilibrium - perhaps thermodynamic, which Deacon sometimes uses, would be clearer?
  • Morphodynamic - describes a system spontaneously organizing, lowering its entropy, increasing information structures
  • Teleodynamic - two morphodynamic systems, one self-organizing, the other autocatalytic, which together exhibit an internal purpose - an "end" or "telos," namely to use the flow through them of negative entropy (matter and energy), enabling them to act (pragmatically) to maintain themselves. (One might ask what exactly it is about Deacon's combination of two systems that adds the telos. Both "self"-organizing and "auto"-catalytic systems exhibit what Howard Pattee calls the self/non-self distinction or "epistemic cut.")

Deacon objects to calling his third level teleonomic, a term that was coined explicitly to remove the theological "intelligent design" connotations of teleological.

Deacon defines teleonomic as "teleological in name only" (see glossary below), which is odd given the term's history in biology: it was introduced by Colin Pittendrigh in 1958, used by Jacques Monod in his 1971 Chance and Necessity, and clarified by Ernst Mayr in his 1974 article Teleological and Teleonomic: A New Analysis, his 1988 book Toward a New Philosophy of Biology, and his 1992 article The Idea of Teleology.

What does Deacon add into his teleodynamic that goes beyond teleonomic? He defines his teleodynamic as "exhibiting end-directedness" and then adds the highly specific and technical criteria "consequence-organized features constituted by the co-creation, complementary constraint, and reciprocal synergy of two or more strongly coupled morphodynamic processes."

Deacon's major work is to build computer-based simulations of these combined morphodynamic processes to better understand their properties, so he is entitled to his technical definitions if they are essential to his dynamical computational models.

His current major goal is to understand how his simple autogen model can combine with information theory to explain the concepts of "reference" and "significance." He variously defines reference as "aboutness" or "re-presentation," the semiotic or semantic relation between a sign-vehicle and its object. He describes significance as the pragmatic dimension of "value," "normativity," "purpose," "interpretation," "function," "usefulness," "end-directed," and "goal-state." "Work is the relevant measure when it comes to assessing the usefulness of information," he says.

Reference is the simple connection between an abstract idea (re-presentation in the mind) and its material (or conceptual) object. In linguistics, reference is the semantic connection between a word and its (dictionary) meaning. For Saussure it is independent of context or Peircean interpretation. Deacon may take a reference as not involving any physical work.
Roman Jakobson added "context" to Claude Shannon's theory of communications, which understandably ignored the "meaning" in a message to study only channel capacity.

By contrast, significance is the pragmatic or functional value of an idea or a sign/symbol when it is interpreted in context by an agent (Peirce's interpretant). The agent must act on a meaningful message (where "meaning" is now not merely the standard reference of the symbol, but what the message means in the context of the agent's future behavior, e.g., love or hate? fight or flight?). An action normally involves physical work, as Deacon correctly notes. And in the context of his purely dynamical, arguably inanimate, autogen, that is appropriate. But for animals and humans, pragmatic consequences may generate only internal thoughts, ideas, and judgments, or emotions, feelings, and desires, which create possibilities for willful action after careful evaluation and decision.

In any case, Deacon is right to distinguish reference and significance (as semantics and pragmatics) and try to understand them in terms of his two morphodynamic processes. But is he right to say that teleodynamics is in some way more purposeful than a teleonomic process, a process that has its purpose "built-in," what Aristotle called "entelechy" (from the Greek en-telos-echein, in-purpose-have)?

Let's carefully read Deacon's difference (that makes a difference) between teleonomic and teleodynamic proposed for his 2015 workshop. He describes
a long-standing debate in the natural sciences over the role of teleology in scientific explanations. This debate was presumed settled in middle of the last century with the development of cybernetic models of goal-directed behavior, such as in guidance systems and adaptive computer algorithms. Systems organized in this way are described as teleonomic rather than teleologic, to indicate that no intrinsic representation of an end is responsible for this behavior, only a systemic deviation-minimizing regulatory mechanism. In contrast, we argue that an interpretive process can only be adequately defined with respect to a process that is organized so that the goal-state contributes to the maintenance of the system with the disposition to attain that state, not merely some arbitrary physical state of things. Deacon (2009, 2012) terms this a teleodynamic process.
Deacon's glossary also contains Ernst Mayr's teleomatic, which Mayr meant to single out systems that are purely mechanical and dynamical, obeying physical laws. Can this include a thermostat (Deacon's "systemic deviation-minimizing regulatory mechanism"), with the appearance of goal-directed behaviors? Deacon defines teleomatic as "Automatically achieving an end, as when a thermodynamic system develops toward equilibrium or gravity provides the end state for a falling rock."

Let's review Mayr's careful and important distinction between teleological, teleonomic, and teleomatic:
What is teleology, and to what extent is it a valid concept? These have been burning questions since the time of Aristotle. Kant based his explanation of biological phenomena, particularly of the perfection of adaptations, on teleology — the notion that organisms were designed for some purpose... And the numerous autogenetic theories of evolution, such as orthogenesis, nomogenesis, aristogenesis, and the omega principle (Teilhard de Chardin), were all based on a teleological world view. Indeed, as Jacques Monod (1971) rightly stressed, almost all of the most important ideologies of the past and the present are built on a belief in teleology.

It is my belief that the pervasive confusion in this subject has been due to a failure to discriminate among very different processes and phenomena, all labeled "teleological." The most important conclusion of the recent research on teleology is that it is illegitimate to extrapolate from the existence of teleonomic processes (that is, those directed or controlled by the organism's own DNA) and teleomatic processes (those resulting from physical laws) to an existence of cosmic teleology. There is neither a program nor a law that can explain and predict biological evolution in any teleological manner. Nor is there, since 1859, any need for a teleological explanation: The Darwinian mechanism of natural selection with its chance aspects and constraints is fully sufficient.

The study of genetics has shown that seemingly goal-directed processes in a living organism (teleonomic processes) have a strictly material basis, being controlled by a coded genetic program.
We must distinguish the non-physical from the immaterial. Information is physical but immaterial.
Deacon has now given us a specific model for the locus of the telos. He says that the first material particles, the first atoms forming molecules, the first stars, and so on, can be explained without reference to anything non-physical. But since these are formed by what he calls morphodynamic processes, they must also involve some immaterial information generation. They are information structures. Information philosophy shows that without the expansion of the universe and ontological chance arising from quantum uncertainty, no new information could have come into existence from an assumed original state of thermodynamic equilibrium. There would be no galaxies, no stars, no planets, no life, no minds, no creative new thoughts, and in particular, no telos.

Why does Deacon describe nature as incomplete? Because information seems non-physical (it is actually physical, just not material), he says, we lack a scientific understanding of how words and sentences refer to atoms of meaning. The meanings of words and thoughts, the contents of the mind — especially goals and purposes — are "not present," he says. He reifies this absence and says cryptically that "a causal role for absence seems to be absent from the natural sciences." He calls this a "figure/ground reversal" in which he focuses on what is absent rather than present, likening it to the concept of zero, the holes in the "(w)hole." We can agree with Deacon that ideas and information are immaterial, neither matter nor energy, but they need matter to be embodied and energy to be communicated. And when they are embodied, they are obviously present (to my mind) — in particular, as those alternative possibilities (merely potential information) in a Shannon communication, those possibilities that are never actualized.

A review of Deacon's Incomplete Nature appeared in the journal BioScience.
Deacon on Information (from Incomplete Nature)
TWO ENTROPIES (see our Entropy Flows)

To the extent that regularity and constraint provide a necessary background for deviation and absence to stand out, nature's most basic convergent regularity must provide the ultimate ground for information. This regularity is of course the spontaneous tendency for entropy to increase in physical systems. Although Rudolf Clausius coined the term entropy in 1865, it was Ludwig Boltzmann who in 1866 recognized that this could be described in terms of increasing disorder. We will therefore refer to this conception of thermodynamic entropy as Boltzmann entropy.

This reliably asymmetric habit of nature provides the ultimate background with respect to which an attribute of one thing can exemplify an attribute of something else. The reason is simple: since non-correlation and disorder are so highly likely, any degree of orderliness of things typically means that some external intervention has perturbed them away from this most probable state. In other words, this spontaneous relentless tendency toward messiness provides the ultimate slate for recording outside interference. If things are not in their most probable state, then something external must have done work to divert them from that state.

A second use of the term entropy has become widely applied to the assessment of information, and for related reasons. In the late 1940s, the Bell Lab mathematician Claude Shannon demonstrated that the most relevant measure of the amount of information that can be carried in a given medium of communication (e.g., in a page of print or a radio transmission) is analogous to statistical entropy. According to Shannon's analysis, the quantity of information conveyed at any point is the improbability of receiving a given transmitted signal, determined with respect to the probabilities of all possible signals that could have been sent. Because this measure of signal options is mathematically analogous to the measure of physical options in thermodynamic entropy, Shannon also called this measure the "entropy" of the signal source. I will refer to this as Shannon entropy to distinguish it from thermodynamic entropy (though we will later see that they are more intimately related than just by analogy).

Consider, for example, a coded communication sent as a finite string of alphanumeric characters. If each possible character can appear with equal probability at every point in the transmission, there is maximum uncertainty about what to expect. This means that each character received reduces this uncertainty, and an entire message reduces the uncertainty with respect to the probability that any possible combination of characters of that length could have been sent. The amount of the uncertainty reduced by receiving a signal is Shannon's measure of the maximum amount of information that can be conveyed by that signal.

In other words, the measure of information conveyed involves comparison of a received signal with respect to possible signals that could have been sent. If there are more possible character types to choose from, or more possible characters in the string, there will be more uncertainty about which will be present where, and thus each will potentially carry more information. Similarly, if there are fewer possible characters, fewer characters comprising a given message, or if the probabilities of characters appearing are not equal, then each will be capable of conveying proportionately less information. Shannon's notion of entropy can be made quite precise for analysis of electronic transmission of signals and yet can also be generalized to cover quite mundane and diverse notions of possible variety. Shannon entropy is thus a measure of how much information these media can possibly carry. Because it is a logical, not a physical, measure, it is widely realizable. It applies as well to a page of text as to the distribution of objects in a room, or the positions that mercury can occupy in a thermometer. Since each object can assume any of a number of alternative positions, each possible configuration of the collection of objects is a potential sign.
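
A quick numeric check of that passage, using hypothetical character probabilities chosen only for illustration:

    import math

    def bits_per_character(probabilities):
        """Average information per character: -sum(p * log2(p))."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(bits_per_character([1 / 26] * 26))                    # 26 equiprobable characters: ~4.70 bits
    print(bits_per_character([1 / 2] * 2))                      # only 2 possible characters: 1.00 bit
    print(bits_per_character([0.5, 0.25] + [0.25 / 24] * 24))   # 26 characters, unequal: ~2.65 bits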

What is "absent" for Deacon are all the unchosen alternative possibilities, without which no new information is created. Compare Stuart Kauffman's "adjacent possibles."
Shannon's analysis of information capacity provides another example of the critical role of absence. According to this way of measuring information, it is not intrinsic to the received communication itself; rather, it is a function of its relationship to something absent — the vast ensemble of other possible communications that could have been sent, but weren't. Without reference to this absent background of possible alternatives, the amount of potential information of a message cannot be measured. In other words, the background of unchosen signals is a critical determinant of what makes the received signals capable of conveying information. No alternatives = no uncertainty = no information.[Our emphasis] Thus Shannon measured the information received in terms of the uncertainty that it removed with respect to what could have been sent.

The analogy to thermodynamic entropy breaks down, however, because Shannon's concept is a logical (or structural) property, not a dynamical property. For example, Shannon entropy does not generally increase spontaneously in most communication systems, so there is no equivalent to the second law of thermodynamics when it comes to the entropy of information. The arrangement of units in a message doesn't spontaneously "tend" to change toward equiprobability. And yet something analogous to this effect becomes relevant in the case of real physically embodied messages conveyed by real mechanisms (such as a radio transmission or a computer network). In the real world of signal transmission, no medium is free from the effects of physical irregularities and functional degradation, an unreliability resulting from the physical effects of the second law.

So both notions of entropy are relevant to the concept of information, though in different ways. The Shannon entropy of a signal is the probability of receiving a given signal from among those possible; and the Boltzmann entropy of the signal is the probability that a given signal may have been corrupted.

A transmission affected by thermodynamic perturbations that make it less than perfectly reliable will introduce an additional level of uncertainty to contend with, but one that decreases information capacity. An increase in the Boltzmann entropy of the physical medium that constitutes the signal carrier corresponds to a decrease in the correlation between sent and received signals. Although this does not decrease the signal entropy, it reduces the amount of uncertainty that can be removed by a given signal, and thus reduces the information capacity.

This identifies two contributors to the entropy of a signal — one associated with the probability of a given signal being sent and the other associated with a given signal being corrupted. This complementary relationship is a hint that the physical and informational uses of the concept of entropy are more than merely analogous. By exploring the relationship between Shannon entropy and Boltzmann entropy, we can shed light on the reason why change in Shannon entropy is critical to information. But the connection is subtle, and its relation to the way that a signal conveys its information is even subtler.
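
To make the excerpt's two roles for entropy concrete, here is a minimal sketch assuming a binary symmetric channel (my illustrative choice, not an example from Incomplete Nature): as the probability that a bit is corrupted rises, the uncertainty a received bit can remove, and hence the channel's information capacity, falls.

    import math

    def binary_entropy(p):
        """H(p) in bits for a probability p."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(error_probability):
        """Capacity of a binary symmetric channel: 1 - H(error) bits per bit sent."""
        return 1.0 - binary_entropy(error_probability)

    for error in (0.0, 0.01, 0.1, 0.5):
        print(error, round(bsc_capacity(error), 3))
    # 0.0 -> 1.0 (noiseless), 0.01 -> 0.919, 0.1 -> 0.531, 0.5 -> 0.0 (pure noise)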