INCOMPLETE NATURE, EMERGENCE AND CONSCIOUSNESS

INTRODUCTION

Terrence Deacon’s 2013 book ‘Incomplete Nature: How Mind Emerged From Matter’ has been generally greeted with acclaim by the philosophical community (with some exceptions; see McGinn’s and Fodor’s reviews). His book, which attempts a naturalistic account of how mind emerged from matter, is challenging and extremely well argued. In this blog I will try to summarise and analyse some of the primary arguments in Deacon’s book.

His ‘Incomplete Nature’ begins with an extremely interesting

analogy. Deacon speaks about how for many years the number

zero was banned from mathematics. Mathematicians feared zero

because of its strange paradoxical characteristics so they did

not admit it into their mathematical systems. Deacon begins

his book with the following quote:

“In the history of culture, the discovery of zero will always stand out as one of the greatest single achievements of the human race.” (Tobias Dantzig)

Deacon argues that a similar situation now occurs for what he

calls ententional phenomena. The word ‘ententional’ was coined

by Deacon as a generic adjective to describe all phenomena

that are intrinsically incomplete in the sense of being in

relationship to, constituted by, or organized to achieve

something non-intrinsic. This includes function, information,

meaning, reference, representation, agency, purpose,

sentience, and value (Incomplete Nature p.549). Deacon argues that ententional phenomena, like the concept of zero, have a primary quality of absence. These absential features mean the relevant phenomena are determined with respect to an absence.

So, for example, our intentional states are directed towards some state of affairs which may or may not obtain, so they have an absential quality. These absential qualities are something that science up until now has found it impossible to give a causal account of. This is a serious state of affairs because something like my ‘belief that P’ can have obvious causal consequences, e.g. my moving my body to position x. Deacon

correctly notes that ententional phenomena are treated by

scientists in a similar way to the way mathematicians treated

zero. Scientists and philosophers treat ententional phenomena

as something spooky and paradoxical which need to be contained

or outright eliminated from our ontology. Deacon proposes that

admitting ententional phenomena into our ontology will greatly

increase our theoretical powers in science in a similar way to

the way admitting zero in to mathematics greatly increased our

calculation capacities. So the aim of his book is to admit the

reality of ententional phenomena and show how they arrived

onto the scene naturalistically. He argues that his theory can

explain how a form of causality dependent on absential

qualities can exist; he claims that this explanation is

compatible with our best science.

It is interesting to briefly compare Deacon’s view with the views of Ladyman et al. as expressed in their ‘Every Thing Must Go’, which I discussed in my last blog. While Ladyman et al. treat ententional phenomena as real patterns, but patterns which do not exist in our fundamental ontology, Deacon treats ententional phenomena in a more robustly realist way. So, for example,

on p.38 he talks of the universe as being a causally closed

system. (All basic causal laws of the universe also follow a

closed system, all change comes from within). This seems

radically at odds with Ladyman and Ross who argue that

causality plays no role at the level of basic physics, though

it does play a role in the special sciences. However, a closer examination shows that Deacon’s views on causation are actually pretty close to those of Ladyman et al.

Deacon outlines the thesis of his book as follows:

“As rhetorically ironic as this sounds, the thesis of this book is that the answer to the age-old riddle of teleology is not provided in terms of “nothing but…” or “something more…” but rather “something less…” This is the essence of what I am calling absentialism” (Incomplete Nature p.43)

When speaking of his explanations in terms of absence he makes claims about processes and substances which are strikingly analogous to claims made by Ladyman et al:

“Showing how these apparently contradictory views can be reconciled requires that we rethink some very basic tacit assumptions about the nature of physical processes and relationships. It requires reframing the way we think about the physical world in thoroughly dynamical, that is to say, process, terms, and recasting our notions of causality in terms of something like the geometry of this dynamics, instead of thinking in terms of material objects in motion affected by contact and fields of force” (ibid p.44)

He argues that while it is intuitive to think of the world in terms of billiard ball causality, when it comes to basic physics this notion has been abandoned in favour of fields of probability rather than discretely localizable stuff. It is the failure to overcome this billiard ball conception of stuff which leads people to think that something must be added to make a mind.

He argues that our ultimate scientific challenge is to precisely characterize the geometry of dynamical processes, from thermodynamic processes to living and mental processes, and to explain their dependency relationships with respect to each other. It is worth at this point considering Deacon’s

criticisms of the metaphysical views of Jaegwon Kim.

DEACON ON EMERGENTISM AND KIM

Deacon argues that the reductionism which began with

Democritus and produced many successful research programmes,

for example, the reduction of chemistry to physics, is

favoured by most contemporary theorists. This view has led a

lot of philosophers to think that smaller is more fundamental

(philosopher Thomas Wilson dubbed this “smallism”). Deacon

correctly notes that this view is not necessarily true:

“It is not obvious, however, that things do get simpler with a

descent in scale, or that there is some ultimate smallest unit

of matter, rather than merely a level of scale below which it

is not possible to discern differences.” (Incomplete Nature

p.153)

Nonetheless he points out that a lot of science has been

successful as a result of thinking of objects in terms of

their component parts. He claims that, despite the success that thinking of objects as made of component parts has had in science, it has the difficulty that it focuses attention away from the contributions of interaction complexity. This, he complains, suggests that investigating the organisational features of things is less important than investigating their component properties.

When discussing emergence Deacon gives Sperry’s example of how

consciousness is a product of certain configurations of matter

which are only found in the brain. Though this does not involve a new type of matter, the configuration can have certain causal consequences that other configurations of matter do not have.

Sperry uses the analogy of how a wheel is a particular

configuration of matter. This configuration does not involve

new stuff but it does have causal consequences that are

unexpected; i.e. the wheel can move in ways that other

configurations of matter cannot. So Deacon thinks we can have

emergentism without lapsing into any kind of mysterious

dualism.

He notes that Jaegwon Kim is generally considered to have

refuted certain forms of emergentist theories. He sums up

Kim’s argument as follows:

“Assuming that we live in a world without magic, and that all composite entities like organisms are made of simpler components without residue, down to some ultimate elementary particles, and assuming that physical interactions ultimately require that these constituents and their causal powers (i.e. physical properties) are the necessary substrate for any physical interaction, then whatever causal properties we ascribe to higher-order composite entities must ultimately be realized by these most basic physical interactions. If this is true, then to claim that the cause of some state or event arises at an emergent higher level is redundant. If all higher-order causal interactions are between objects constituted by relationships among these ultimate building blocks of matter, then assigning causal power to various higher-order relations is to do bookkeeping. It’s all just quarks and gluons – or pick your favourite smallest unit – and everything else is just a gloss or descriptive simplification of what goes on at that level. As Jerry Fodor describes it, Kim’s challenge to emergentists is: why is there anything except physics?” (Terrence Deacon, ‘Incomplete Nature’, 2013, p.165)

Deacon claims that Kim’s challenge can be attacked from the

point of view of contemporary physics. He argues that the

substance metaphysics which Kim uses to support his

mereological analysis is not supported by quantum physics.

Deacon’s point is similar to a point made by Ladyman et al. in their ‘Every Thing Must Go’, while Tim Maudlin has made similar points in his ‘The Metaphysics Within Physics’. The problem

which all those theorists note is that there are not any

ultimate particles or simple ‘atoms’ devoid of lower level

compositional organisation on which to ground unambiguous

higher-level distinctions of causal power (ibid p.167).

On page 31 of their ‘Every Thing Must Go’ Ladyman et al. also attack Jaegwon Kim’s 1998 book ‘Mind in a Physical World’ for claiming to be a defence of physicalism despite the fact that it doesn’t engage with any contemporary physics. They note that there are no physics papers or physics books even cited by Kim in his book. This is in addition to the fact that Kim’s arguments rely on non-trivial assumptions about how the physical world works (ETMG p.31). According to Ladyman et al., Kim defines a ‘micro-based property’ which involves the property of being ‘decomposable into non-overlapping proper parts’ (Kim 1998, p.84; quoted in ETMG). This assumption does a lot of Kim’s work in defending his physicalism; however, as we know from quantum entanglement, micro-components of reality are not decomposable in this way. They explicate their criticism of Kim as follows:

“Kim’s micro-based properties, completely decomposable into

non-overlapping proper parts, and Lewis’s ‘intrinsic

properties of points-or of any point sized occupants of

points’, both fall foul of the non-separability of quantum

states, something that has been a well-established part of

micro-physics for generations” (ETMG p. 32)

Ladyman et al are attacking Kim’s metaphysics because it is

done without regard to our best fundamental physics. As we can

see their criticism of Kim is entirely in agreement with

Deacon’s criticism. Deacon, however, is not merely criticising Kim for ignoring contemporary quantum physics in his metaphysics. He is making the further claim that contemporary quantum physics shows that Kim’s arguments against emergentism do not work.

Deacon correctly argues that in quantum physics when we get to the lowest level we have quantum fields which are not divisible into point particles. He notes that quantum fields have ambiguous spatio-temporal origins, have extended properties that are only statistically and dynamically definable, etc. So, given that the quantum world behaves in this way, Kim’s analysis into mereological (part/whole) features fails.

Kim in effect argued that since at bottom it is all just quarks and gluons, and everything else is just a descriptive simplification of what goes on at that level, we are forced to ask: why is there anything other than physics? Deacon notes that the emergentism which Kim is

criticising is connected to the idea of supervenience which

argues that “There cannot be two events exactly alike in all

physical respects but differing in some mental respects, or

that an object cannot alter in some mental respects without

altering in some physical respect” (Davidson 1970). People who argue for emergence have to account for how something can emerge which is entirely dependent on the physical but is not reducible to it. Kim thinks that his argument shows that this

type of emergence is not possible.
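The Davidson-style supervenience claim quoted above can be put in a standard formal schema (this is the usual ‘strong supervenience’ rendering from the philosophy of mind literature, not notation used by Deacon or Kim):

```latex
\forall x\,\forall y\;\Big[\big(\forall P \in \mathrm{Phys}:\; Px \leftrightarrow Py\big)\;\rightarrow\;\big(\forall M \in \mathrm{Ment}:\; Mx \leftrightarrow My\big)\Big]
```

Read: any two entities alike in every physical respect are alike in every mental respect. The emergentist’s burden is to explain how mental properties can satisfy this schema while retaining distinctive causal powers of their own.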

Kim in effect argues that since there can be no differences in

the whole without differences in the parts then emergence

theories cannot be true. Kim claims that in our theory we want to avoid double-counting: if we map all causal powers to distinctive non-overlapping parts of things, this leaves no room to find causal powers uniquely emergent in aggregates of these parts, no matter how they are organised (Incomplete Nature p.167).

Now Deacon replied to this by criticising Kim for not keeping

up with developments in quantum mechanics. Philosopher Paul So

made the following reply to Deacon:

“What I do remember is that Kim’s problem with emergentism is that emergent phenomena like mental states may not have causal powers. Specifically, if mental states supervene (or ontologically depend) on physical ones, then it’s really the physical ones that do all the causal work, whereas mental states just appear to have causal powers. I agree that Kim may need to learn more quantum mechanics to substantiate his claim, but I don’t think Deacon’s objection really refutes his primary concern; I don’t think Kim’s assumption about particles being fundamental constituents of the physical world is essential to his concern. What is essential to his concern is how an emergent phenomenon like mental states can do any causal work if it’s really the phenomena from fundamental physics that do all the work. We could just replace particles with quantum fields and his concern could still stand. If emergent phenomena such as our mental states ontologically depend on quantum fields, then it appears that quantum fields do all the causal work (though in a statistical manner).” (Paul So, personal communication)

Now I should first note that Deacon’s concern is with the fact

that Kim’s argument is dependent on a part-whole analysis, and

this part/whole analysis is simply impossible at the quantum

level. It is only at the macro-level that we can do the type

of mereological analysis that Kim suggests. Deacon discusses

the philosopher Mark Bickhard, who argues against Kim:

“A straightforward framing of this challenge to a mereological conception of emergence is provided by cognitive scientist and philosopher Mark Bickhard. His response to this critique of emergence is that the substance metaphysics assumption requires that at the base, particles participate in organisation but they do not themselves have organisation. But, he argues, point particles without organisation do not exist, because real particles are the somewhat indeterminate loci of inherently oscillatory quantum fields. These are irreducibly process-like and thus by definition organised. But if process organization is the irreducible source of the causal properties at this level, then it cannot be delegitimated as a potential locus of causal power without eliminating causality from the world. It follows that if the organisation of a process is the fundamental source of its causal power, then fundamental reorganisations of process, at whatever level this occurs, should be associated with a reorganisation of causal power as well” (ibid p.168)

The above quote shows why Paul’s objection doesn’t refute Deacon’s argument. However, it is worth noting that simply because this particular argument of Kim’s fails, emergentism is not thereby vindicated; Deacon has merely shown that one argument against it does not work. He will need a much more extensive discussion of the philosophers who reject emergence if his positive thesis is to be sustained.

DEACON ON AUTOGENS

“The reciprocal complementarity of these two self-organizing processes creates the potential for self-repair, self-reconstitution, and even self-replication in a minimal form” (Incomplete Nature p.306)

So what we are looking at here is a co-facilitation of morphodynamic processes. By an autogen Deacon means the whole class of minimally related teleodynamic systems. Something is an autogen if it is a simple dynamical system that achieves self-generation by harnessing the co-dependent reciprocity of component morphodynamic processes. He notes that though each process is self-undermining considered in isolation, as co-dependent processes they are reciprocally self-limiting, so that their self-undermining features are reciprocally counteracted. As a result an autogenic system will establish its capacity to re-form before exhausting its substrates, so long as closure is completed before reaching this point (ibid p.308). The two processes provide boundary conditions for each other, each providing a supporting environment for the other.

Deacon also argues that an autogen can reproduce. He argues that fractured components of a disrupted autogen will be able to create new autogens by the same process by which the original autogen was created. In trying to think through how the first life arose from non-life, Deacon begins with the notion of autocatalysis. Typically a closed molecular system will tend toward some steady

state, with fewer and fewer chemical reactions occurring over time, as the overall distribution of reactions runs in directions that offset each other – this is the second law of thermodynamics at work (ibid p.293). Deacon notes that with chemical systems maintained far from equilibrium, where the conditions for asymmetric reaction probabilities are not reduced, non-equilibrium dynamics can produce some striking features (ibid p.293). Deacon claims that the most relevant class of non-equilibrium chemical processes is autocatalysis.

A catalyst is a molecule that, because of its allosteric geometry and energetic characteristics, increases the probability of some other chemical reaction taking place without itself being altered in the process (ibid p.293); hence it introduces a thermodynamic element into a chemical reaction as a consequence of its shape with respect to other molecules.

Autocatalysis is a special case of catalytic reactions in which a small set of catalysts each augment the production of another member of the set, so that ultimately all members of the set are produced. This has the effect of creating a runaway increase of the molecules of the autocatalytic set at the expense of other molecular forms, until all substrates are exhausted. Autocatalysis is thus, briefly, a self-amplifying chemical process that proceeds at ever-higher rates, producing more of the same catalysts with every iteration of the reaction. Deacon notes that according to some people autocatalytic sets are extremely rare (though Stuart Kauffman has argued that they are not as rare as believed; in fact Kauffman has argued that autocatalysis is inevitable under not too extreme conditions). Manfred Eigen has studied hypercycles, where autocatalysts lead to more autocatalysts. Deacon notes that obviously for such autocatalysis to occur we need a rich substrate to keep things going. When the raw materials are used up the interdependent network of catalysts dissipates.
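This runaway-then-collapse dynamic can be seen in a toy discrete-time simulation of a reaction S + C → 2C (a sketch of mine with made-up rate constants, not a model from Deacon): the catalyst population explodes while substrate lasts, then, with nothing to regulate or replenish it, decays once the substrate is exhausted.

```python
# Toy model of an autocatalytic reaction S + C -> 2C with slow catalyst decay.
# All rate constants and initial concentrations are illustrative choices only.

def simulate(steps=2000, dt=0.01, k=1.0, decay=0.05, s0=10.0, c0=0.01):
    s, c = s0, c0          # substrate and catalyst concentrations
    history = []
    for _ in range(steps):
        # catalysis: substrate converted into more catalyst, capped by what remains
        produced = min(k * s * c * dt, s)
        s -= produced
        # catalysts degrade slowly; nothing in the system re-forms them
        c += produced - decay * c * dt
        history.append(c)
    return s, c, history

s_end, c_end, history = simulate()
peak = max(history)        # runaway growth peaks when the substrate runs out
```

Running this shows the self-amplifying phase (catalyst concentration climbing at an accelerating rate), substrate exhaustion, and then slow dissipation of the catalytic set: self-promoting, but not self-maintaining.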
So, he argues, autocatalysis is SELF-PROMOTING, but not SELF-REGULATING or SELF-MAINTAINING. These catalytic networks of molecular interactions characterize the metabolic processes of living cells, so this is far from impossible.

Deacon begins his discussion of evolution by quoting from

Batten et al.’s 2009 paper ‘Visions of Evolution: Self-Organisation Proposes What Natural Selection Disposes’. Deacon argues that while a lot of evolutionary theory tells the story of random mutations and natural selection being the primary story in evolution, there is growing evidence that self-organisation plays as big a role as random mutations. Deacon correctly argues that strictly speaking the theory of evolution is substrate neutral (hence a precursor to functionalism), so it doesn’t really matter to selection whether it is working on a change that results from a self-organizing process or a change that results from random mutation.

He argues that this substrate neutrality has clear implications for emergence. If, like Dennett, we think of evolution as an algorithmic process that can be implemented on various different machines, we can see its relevance for emergence. When a particular functional organisation is selected over thousands upon thousands of years, the same physical process is not necessarily used all of the time. He likens this to the way that 2+2=4 can be calculated using different devices: a human brain, a calculator, a page, fingers, etc. Likewise an adaptation is not identical to the collection of properties constituting the specific mechanism that realises it (ibid p.425). In some ways it is less than the collection of properties, as only some of the properties are relevant to the success of an adaptation. While in some ways an adaptation is more than its properties, as it is the consequence of an extended history of constraints being passed from generation to generation (constraints that are the product of many different substrates) (ibid p.425).

Later Deacon describes evolution as follows:

“Evolution in this sense can be thought of as a process of capturing, taming and integrating diverse morphodynamic processes for the sake of their collective preservation.” (ibid p.427)

He gives some clear examples of morphodynamic processes: (1) whirlpools, (2) convection cells, (3) snow crystal growth, (4) Fibonacci structures in inorganic matter. And he argues that these morphodynamic processes are interesting because:

“…What makes all these processes notable, and motivates the prefix morpho- (form), is that they are processes that generate regularity not in response to the extrinsic imposition of regularity, or by being shaped by any template structure, but rather by virtue of regularities that are amplified internally via interaction dynamics alone under the influence of persistent external perturbations.” (ibid p.242)

When discussing autogens Deacon does seem to be invoking a kind of intrinsic intentionality when cashing out what he believes are the earliest forms of information (in his sense) using autogens. Deacon argues that early autogens had information (in the sense of aboutness) because a particular form of chemical bond was sensitive to which environments were good for the autogen. This is a strange argument and is worth looking at closely.

“They argue that an autogenic system in which its containment is made more fragile by the bonding of relevant substrate molecules to its surface could be considered to respond selectively to its environment” (ibid p.443)

Here the use of the word information (as in aboutness) is extremely strange. In what sense could the bonding of substrate molecules to the surface of an autogen convey information? Deacon notes that this bonding (or lack thereof) will have consequences for whether particular autogens survive in particular environments.

So, for example, if the autogen’s containment is disrupted in a context where the substrates that support autocatalysis are absent or of low concentration, re-enclosure will be unlikely. So stability of containment is advantageous for persistence of a given variant in contexts where the presence of relevant substrates is of low probability. He correctly notes that this is an adaptation. (So different autogens are selected by different environments.) While these autogens cannot initiate their own reproduction, their differential susceptibility to disruption with respect to relevant context is a step in that direction (ibid p.443).

“It seems to me that at this stage we have introduced an unambiguous form of information and its interpretative basis. Binding of relevant substrate molecules is

information about the suitability of the environment for successful replication; and since successful replication increases the probability that a given autogenic form will persist, compared to other variants with less success at replication, we are justified in describing this as information about the environment for the maintenance of this interpretative capacity.” (ibid p.443)

I suppose one could say that it is indeed information in Bateson’s sense (a difference that makes a difference), but there is nothing at that point that HAS the information: we see the information as theorists, but the information is not grasped by anything. We could view the information as a real pattern that exists whether or not there is anyone around to observe it.

Deacon then asks the obvious question: what is the difference between the information which the autogen has and the information of, say, a thermostat? He says that in the absence of a human designer a thermostat only has Shannon information. His argument is that while things like thermostats provide information about particular aspects of their environments, this is not as significant as it may seem. A wet towel can provide information about the temperature of a room, or the tracking of dirt in from the street can indicate that somebody has entered the room. He notes that while things like wet towels can provide information as to the temperature of the room, there are an infinite number of other things that the wet towel is informative about, depending on the interpretative process used. He argues that that is the KEY DIFFERENCE: what information the physical process provides depends on the idiosyncratic needs and interests of the interpreter. So, he claims, there is no INTRINSIC END-DIRECTEDNESS to these mechanisms or physical processes.

He makes the startling claim that this INTRINSIC END-DIRECTEDNESS is provided by the autogen’s tendency to reconstitute itself after being disrupted. This seems like the dreaded teleology being introduced for even simple non-living autogens. His argument involves noting that as the autogen tends towards a specific target state, it also tends to develop towards a state that reconstitutes and replicates this tendency as well. This tendency has

normative consequences: autogens can receive misinformation (bind with molecules that are not catalytic substrates and which will weaken them), and such autogen lineages will tend to be wiped out. So where there is information (non-Shannonian) there is the possibility of error (because aboutness is an extrinsic relationship and so is necessarily fallible).

I am not sure what to make of his claims here, to be honest. For me Dennett put paid to the idea of intrinsic aboutness as far back as his (1987) ‘Error, Evolution, and Intentionality’, where he attacks Fodor’s notion of intrinsic intentionality. It seems to me that his arguments go through against Deacon without modification. If Deacon is arguing that the normativity of these autogens is a ‘real pattern’ I would have no problem, but I am having trouble understanding his idea of INTRINSIC END-DIRECTEDNESS.

Deacon not only speaks of autogens having relations of aboutness to the world; he goes further and argues that autogens have a type of consciousness. He even argues that individual neurons in our brain are conscious:

“The central claim of this analysis is that sentience is a typical emergent attribute of any teleodynamic system… A neuron is a single cell, and simpler in many ways than almost any other single-cell eukaryotic organism, such as an amoeba. But despite its dependence on being situated within a body and within a brain, and having its metabolism constantly tweaked by signals impinging on it from hundreds of other neurons, in terms of the broad definition of sentience I have described above, neurons are sentient agents” (ibid p.509)

This claim of course means that for Deacon an autogen is conscious. Now above we noted that for Deacon an autogen has intrinsic end-directedness because there are aspects of its environment that are good for its survival and reproduction. Deacon has here offered an interesting proposal as to how consciousness and aboutness emerge, one that manages to avoid panpsychism, the exceptionalism which claims that only humans are conscious and that consciousness magically emerges for us and us alone, and eliminativism, which says that experience does not exist. However, I don’t think he has really offered any evidence to support his theory over the theories of his rivals, though it is certainly an interesting proposal that, if researched further, could lead to answers.

CONSCIOUSNESS AND PAIN

As his key exemplar of consciousness Deacon focuses on pain and suffering. I think this is an interesting and useful starting point. In his 1978 paper ‘Why You Can’t Make a Computer That Feels Pain’, Dennett discussed whether it would be possible to build a computer that feels pain. Dennett’s conclusion was that you couldn’t build a computer that feels pain because the ordinary language conception of pain is radically incoherent. So if we try to model pain (as a coherent phenomenon) we will have to radically modify the ordinary language concept, and once this concept is so modified that we can model it in a computer, it will not share all of the features that we typically associate with pain. So it is no good for a theorist to say that a computer does not experience pain because intuitively pain must have properties x and y, because we know that people’s intuitions about the nature of pain form a radically incoherent set.

Most people upon reading Dennett respond that an ESSENTIAL feature of pain is its intrinsically horrible feel; since there is no evidence that computers feel pain in this ESSENTIAL sense, we cannot build a computer that feels the essential characteristic of pain: its awfulness. Now Dennett could reply to this by arguing that pain has no essential features. He could point to pain asymbolia, where people feel pain but do not mind the pain (hence don’t find it intrinsically awful), to show that pain does not have an essential feature of awfulness. Likewise he could point to the fact that people who have congenital analgesia do not experience pain. What is interesting about this disorder is that people who suffer from it typically injure themselves at night because they do not adjust their body when asleep. People without the disorder typically adjust their bodies if in an uncomfortable position (even while asleep), which indicates that we can feel pain even while asleep. It is hard to know what to make of these claims, but Dennett notes that the fact that we can have unconscious pains, and pain which doesn’t feel awful, indicates that we should not comfortably speak of essential features of pain. Hence he thinks we should not take too seriously the intuition that the supposedly essential features of pain are un-modelable in a computer.

Deacon agrees with some of Dennett’s opponents that we cannot have a computer that feels pain. But Deacon’s reasons are not mere reliance on one’s intuitions about the nature of pain. Deacon describes emotion as follows:

“In the previous section, we identified the experience of

emotion with the tension and work associated with employing

metabolic means to modify morphodynamics. It was noted that

particularly in cases of life-and-death contexts,

morphodynamic change must be instituted rapidly and

extensively, and this requires extensive work at both the

homeodynamic (metabolic) and morphodynamic (mental) levels.

The extent of this work and the intensity of the tension

created by the resistance of dynamical processes to rapid

change, is I submit experienced as the intensity of emotion”

(Incomplete Nature p. 527)

So for Deacon, for a machine to experience emotion it needs work (the tension associated with fighting against the 2nd law of thermodynamics), so a machine in the sense of Dennett 1978 (which is just a description of a physical regularity) cannot experience emotion.

Of course Daniel Dennett today may have moved closer to Deacon’s views, as can be seen in his talk ‘If Brains are Computers, what Kind of Computers are they?’ http://youtu.be/OlRHd-r2LOw . Dennett’s talk of selfish neurons forming coalitions with each other is very close to Deacon’s views on the nature of neuronal activation.

Throughout the book Deacon makes claims about the nature of possibility that need to be thought through very carefully, as they are hard to square with physics, biology, or anything else. Is he arguing for a possible-world type scenario? Does his view commit him to modal realism à la Lewis? Or is it the type of possible world proposed by Kripke? In ETMG Ladyman et al. discuss modal realism; one wonders whether Deacon (who is a naturalist) is defending a similar type of modal realism to theirs. Unfortunately, though one of the central ideas in the book is that what doesn’t happen, or could have happened but didn’t, affects the physical world, Deacon doesn’t really explicate what he means by possibility. So I need to think through what the metaphysics of possibility implied by his thesis really amounts to. Thinking through and comparing Williamson’s views on the nature of possibility as sketched in his new book ‘Modal Logic as Metaphysics’ with Ladyman et al’s OSR and Deacon’s Incomplete Nature may be helpful.

In this blog I have merely described some of Deacon’s main

arguments to defend his position on how mind emerged from

matter, and gestured towards some strengths and weaknesses in

his view. In my next blog I will discuss his conception of

information, and modality in detail and compare his views with

those of Ladyman et al.

I think that his book is a useful starting point in helping us think through these complex issues. Aside from the need to factor thermodynamics into our computational theories of how the mind and life work, though, I think that he didn’t provide sufficient evidence to support his claims of autogens having end-directedness or sentience. That said, his book represents an interesting attempt to deal with some very old and intractable problems.