
Published in ArXiv http://arxiv.org/abs/0812.4852 http://commonsense.carlhewitt.info

September 26, 2009

Common sense for concurrency and strong paraconsistency

using unstratified inference and reflection

Carl Hewitt

http://carlhewitt.info

This paper is dedicated to John McCarthy.

Abstract

This paper develops a strongly paraconsistent formalism (called Direct Logic™) that incorporates the mathematics of Computer Science and allows unstratified inference and reflection, while retaining mathematical induction and almost all of classical logic. Direct Logic allows mutual reflection among the code, documentation, and use cases of large software systems, which are chock full of inconsistencies, thereby overcoming the limitations of the traditional Tarskian framework of stratified metatheories. Gödel first formalized and proved that it is not possible to decide all mathematical questions by inference in his 1st incompleteness theorem. However, the incompleteness theorem (as generalized by Rosser) relies on the assumption of consistency! This paper proves a generalization of the Gödel/Rosser incompleteness theorem: a strongly paraconsistent theory is self-provably incomplete. However, there is a further consequence: although the semi-classical mathematical fragment of Direct Logic is evidently consistent, since the Gödelian paradoxical proposition is self-provable, every reflective strongly paraconsistent theory in Direct Logic is self-provably inconsistent! This paper also proves that Logic Programming is not computationally universal in that there are concurrent programs for which there is no equivalent in Direct Logic. Consequently the Logic Programming paradigm is strictly less general than the Procedural Embedding of Knowledge paradigm. Thus the paper defines a concurrent programming language ActorScript™ (that is suitable for expressing massive concurrency in large software systems) meta-circularly in terms of itself.

Contents

  Introduction
  Limitations of First Order Logic
  Inconsistency is the Norm in Large Software Systems
  Consistency has been the bedrock of mathematics
  Paraconsistency has been around for a while. So what's new?
  Direct Logic
    Direct Logic is based on argument rather than truth
    Syntax of Direct Logic
    Soundness, Faithfulness, and Adequacy
    Self-refutation (Direct Indirect Inference)
    Booleans
    Two-way Deduction Theorem
    Boolean equivalences
    Disjunction Introduction by Negation
    Additional principles
    Direct Logic uses strong paraconsistency to facilitate theory development
  Unstratified Reflection is the Norm
    Abstraction and Reification
    Diagonal Argument
    Logical Fixed Point Theorem
    Disadvantages of stratified metatheories
    Reification Reflection
    Incompleteness Theorem for Theories of Direct Logic
    Inconsistency Theorem for Theories of Direct Logic
    Consequences of Logically Necessary Inconsistency
  Concurrency is the Norm
    Nondeterministic computation
    Computation is not subsumed by logical deduction
    Arrival order indeterminacy
    Computational Representation Theorem
    Concurrency requires unbounded nondeterminism
    Unbounded nondeterminism in an Actor programming language
    Scientific Community Metaphor
  The admission of logical powerlessness
  Work to be done
  Conclusion
  Acknowledgements
  References
  Appendix. Additional Principles of Direct Logic
    Relevance Logic
    Equality
    Nondeterministic λ-calculus
    Set Theory
    Noncompactness
    Direct Logic is based on XML
    Provably Inference Reflected Propositions in Theories of Direct Logic


Introduction

"But if the general truths of Logic are of such a nature that when presented to the mind they at once command assent, wherein consists the difficulty of constructing the Science of Logic?"
[Boole 1853 pg 3]

Our lives are changing: soon we will always be online. (If you have doubts, check out the kids and the VPs of major corporations.) Because of this change, common sense must adapt to interacting effectively with large software systems just as we have previously adapted common sense to new technology. Logic should provide foundational principles for common sense reasoning about large software systems.

John McCarthy is the principal founding Logicist of Artificial Intelligence, although he might decline the title.1 Simply put, the Logicist Programme is to express knowledge in logical propositions and to derive information solely by classical logic inferences. Building on the work of many predecessors [Hewitt 2008d], the Logicists Bob Kowalski and Pat Hayes extended the Logicist Programme by attempting to encompass programming by using classical mathematical logic as a programming language.

This paper discusses three challenges to the Logicist Programme:

1. Inconsistency is the norm and consequently classical logic infers too much, i.e., anything and everything. The experience (e.g., at Microsoft, the US government, IBM, etc.) is that inconsistencies (e.g., among implementations, documentation, and use cases) in large software systems are pervasive and, despite enormous expense, have not been eliminated. Standard mathematical logic has the problem that from inconsistent information any conclusion whatsoever can be drawn, e.g., "The moon is made of green cheese." However, our society is increasingly dependent on these large-scale software systems and we need to be able to reason about them. In fact professionals in our society reason about these inconsistent systems all the time. So evidently they are not bound by classical mathematical logic.

2. Unstratified inference and reflection are the norm and consequently logic must be extended to use unstratified inference and reflection for strongly paraconsistent theories. However, the traditional approach (using the Tarskian framework of hierarchically stratified metatheories) is unsuitable for Software Engineering because unstratified direct and indirect mutual reference pervades reasoning about use cases, documentation, and code.

3. Concurrency is the norm. Logic Programs based on the inference rules of mathematical logic are not computationally universal because the arrival order indeterminate computations of concurrent programs in open systems cannot be deduced using mathematical logic. The fact that computation is not reducible to logical inference has important practical consequences. For example, reasoning used in Semantic Integration cannot be implemented using logical inference [Hewitt 2008a].

Large software systems are becoming increasingly permeated with inconsistency, unstratified inference and reflection, and concurrency. As these inconsistent reflective concurrent systems become a major part of the environment in which we live, it becomes an issue of common sense how to use them effectively. This paper suggests some principles and practices.

1 Logicist and Logicism are used in this paper in the general sense pertaining to logic rather than in the restricted technical sense of maintaining that mathematics is in some important sense reducible to logic.

Limitations of First Order Logic

“A foolish consistency is the hobgoblin of little minds.”

---Emerson [1841]

First Order Logic is woefully lacking for reasoning about large software systems. For example, a limitation of classical logic for inconsistent theories is that it supports the principle that from an inconsistency anything can be inferred, e.g., "The moon is made of green cheese."

For convenience, I have given the above principle the name IGOR for Inconsistency in Garbage Out Redux.2 IGOR can be formalized as follows, in which a contradiction about a proposition Ω infers any proposition Ψ:3

  Ω, ¬Ω ├ Ψ

The IGOR principle of classical logic may not seem very intuitive! So why is it included in classical logic? The IGOR principle is readily derived from the following principles of classical logic:

• Full indirect inference: (Ψ ├ Φ, ¬Φ) ⇨ (├ ¬Ψ), which can be justified in classical logic on the grounds that if Ψ infers a contradiction in a consistent theory then Ψ must be false. In an inconsistent theory, full indirect inference leads to explosion by the following derivation in classical logic, in which a contradiction about P infers any proposition Φ:

    from P, ¬P, by monotonicity ¬Φ ├ P, ¬P;  hence ├ ¬¬Φ;  hence ├ Φ

• Disjunction introduction: Ψ ├ (Ψ∨Φ), which in classical logic says that if Ψ is true then (Ψ∨Φ) is true regardless of whether Φ is true.4 In an inconsistent theory, disjunction introduction leads to explosion via the following derivation in classical logic, in which a contradiction about P infers any proposition Φ:

    P, ¬P ├ (P∨Φ), ¬P ├ Φ

2 In Latin, the principle is called ex falso quodlibet, which means that from falsity anything follows.
3 Using the symbol ├ to mean "infers in classical mathematical logic" and ⇨ to mean classical mathematical logical implication. Also ⇔ is used for logical equivalence, i.e., "if and only if".
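The second derivation can be replayed mechanically. The following is a minimal illustrative sketch in Python (the class and function names are hypothetical, not part of any formalism in this paper): starting from the contradictory premises P and ¬P, it applies Disjunction Introduction and then Disjunctive Syllogism to obtain an arbitrary unrelated proposition Q.

    from dataclasses import dataclass

    # Tiny propositional syntax: atoms, negation, disjunction.
    @dataclass(frozen=True)
    class Atom:
        name: str

    @dataclass(frozen=True)
    class Not:
        arg: object

    @dataclass(frozen=True)
    class Or:
        left: object
        right: object

    def disjunction_introduction(premises, psi, phi):
        """Classical rule: from psi infer (psi or phi)."""
        assert psi in premises
        return premises | {Or(psi, phi)}

    def disjunctive_syllogism(premises, psi, phi):
        """Classical rule: from (psi or phi) and (not psi) infer phi."""
        assert Or(psi, phi) in premises and Not(psi) in premises
        return premises | {phi}

    def explode(p, q):
        """Derive an arbitrary q from the contradictory premises {p, not p}."""
        premises = {p, Not(p)}
        premises = disjunction_introduction(premises, p, q)   # p        =>  p or q
        premises = disjunctive_syllogism(premises, p, q)      # p or q, not p  =>  q
        return q in premises

    P, Q = Atom("P"), Atom("Q")
    print(explode(P, Q))  # True: the unrelated Q follows classically from P, not P

Direct Logic blocks exactly this derivation by not supporting general disjunction introduction (and general indirect inference) in its strongly paraconsistent theories, as described below.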

Other limitations of First Order Logic include:

• It lacks reflection, so it can't deal with mutually reflective propositions, e.g., among documentation, use cases, and implementations of large software systems. Also it is stratified, meaning that different theories cannot mutually refer to each other's inferences. In particular a theory cannot directly reason about itself.

• It doesn't handle the mathematical induction needed for inferring properties of programs. Nor does it handle reasoning about contention in concurrency.

The plan of this paper is as follows:

1. Solve the above problems with First Order Logic by introducing a new system called Direct Logic5 for large software systems.

2. Demonstrate that no Logicist system is computationally universal (not even Direct Logic, even though it is evidently more powerful than any logic system that has been previously developed). I.e., there are concurrent programs for which there is no equivalent Logic Program.

3. Discuss the implications of the above results for common sense.

4 Disjunction introduction follows from the principle of absorption that Ψ is equivalent to Ψ∧(Ψ∨Φ). Consequently absorption must be limited to one direction.
5 Direct Logic is called "direct" due to considerations such as the following:
  • Direct Logic does not incorporate general indirect proof in a theory T. Instead it only allows "direct" forms of indirect proof in the form of self-refutation, e.g., (Φ ├T ¬Φ) ├T (├T ¬Φ). See discussion below.
  • In Direct Logic, paraconsistent theories speak directly about their own provability relation rather than having to resort to indirect propositions in a meta-theory.
  • Inference of Φ from Ψ in a theory T (Ψ ├T Φ) is "direct" in the sense that it does not automatically incorporate the contrapositive, i.e., it does not automatically incorporate (¬Φ ├T ¬Ψ). See discussion below.

Inconsistency is the Norm in Large Software

Systems

“find bugs faster than developers can fix them

and each fix leads to another bug”

--Cusumano & Selby 1995, p. 40

The development of large software systems and the

extreme dependence of our society on these systems have

introduced new phenomena. These systems have pervasive

inconsistencies among and within the following:

Use cases that express how systems can be used

and tested in practice

Documentation that expresses over-arching

justification for systems and their technologies

Code that expresses implementations of systems

Adapting a metaphor6 used by Karl Popper for science, the

bold structure of a large software system rises, as it were,

above a swamp. It is like a building erected on piles. The

piles are driven down from above into the swamp, but not

down to any natural or given base; and when we cease our

attempts to drive our piles into a deeper layer, it is not

because we have reached bedrock. We simply pause when

we are satisfied that they are firm enough to carry the

structure, at least for the time being. Or perhaps we do

something else more pressing. Under some piles there is

no rock. Also some rock does not hold.

Different communities are responsible for constructing,

evolving, justifying and maintaining documentation, use

cases, and code for large, human-interaction, software

systems. In specific cases any one consideration can trump

the others. Sometimes debates over inconsistencies among

the parts can become quite heated, e.g., between vendors.

In the long run, after difficult negotiations, in large

software systems, use cases, documentation, and code all

change to produce systems with new inconsistencies.

However, no one knows what they are or where they are

located!

Furthermore there is no evident way to divide up the code, documentation, and use cases into meaningful, consistent microtheories for human-computer interaction.

Organizations such as Microsoft, the US government, and IBM have tens of thousands of employees poring over hundreds of millions of lines of documentation, code, and use cases attempting to cope. In the course of time almost all of this code will interoperate using Web Services. A large software system is never done [Rosenberg 2007].

The thinking in almost all scientific and engineering work has been that models (also called theories or microtheories) should be internally consistent, although they could be inconsistent with each other.7

6 Popper [1934] section 30.

Consistency has been the bedrock of mathematics

When we risk no contradiction,

It prompts the tongue to deal in fiction.

Gay [1727]

Platonic Ideals8 were to be perfect, unchanging, and eternal.9 Beginning with the Hellenistic mathematician Euclid [circa 300BC] in Alexandria, theories were intuitively supposed to be both consistent and complete. Wilhelm Leibniz, Giuseppe Peano, George Boole, Augustus De Morgan, Richard Dedekind, Gottlob Frege, Charles Peirce, David Hilbert, etc. developed mathematical logic. However, a crisis occurred with the discovery of the logical paradoxes based on self-reference by Cesare Burali-Forti [1897], Cantor [1899], Bertrand Russell [1903], etc. In response Russell [1908] stratified types, [Zermelo 1905, Fränkel 1922, Skolem 1922] stratified sets, and [Tarski and Vaught 1957] stratified logical theories to limit self-reference. Kurt Gödel [1931] proved that mathematical theories are incomplete, i.e., there are propositions which can neither be proved nor disproved. Consequently, although completeness and unrestricted self-reference were discarded for general mathematics, the bedrock of consistency remained.

7 Indeed some researchers have even gone so far as to construct consistency proofs for some small software systems, e.g., [Davis and Morgenstern 2005] in their system for deriving plausible conclusions using classical logical inference for Multi-Agent Systems. In order to carry out the consistency proof of their system, Davis and Morgenstern make some simplifying assumptions:
  • No two agents can simultaneously make a choice (following [Reiter 2001]).
  • No two agents can simultaneously send each other inconsistent information.
  • Each agent is individually serial, i.e., each agent can execute only one primitive action at a time.
  • There is a global clock time.
  • Agents use classical Speech Acts (see [Hewitt 2006b, 2007a, 2007c, 2008c]).
  • Knowledge is expressed in first-order logic.
The above assumptions are not particularly good ones for modern systems (e.g., using Web Services and many-core computer architectures) [Hewitt 2007a]. The following conclusions can be drawn for documentation, use cases, and code of large software systems for human-computer interaction:
  • Consistency proofs are impossible for whole systems.
  • There are some consistent subtheories but they are typically mathematical. There are some other consistent microtheories as well, but they are small, make simplistic assumptions, and typically are inconsistent with other such microtheories [Addanki, Cremonini and Penberthy 1989].
Nevertheless, the Davis and Morgenstern research programme to prove consistency of microtheories can be valuable for the theories to which it can be applied. Also some of the techniques that they have developed may be able to be used to prove the consistency of the mathematical fragment of Direct Logic and to prove the paraconsistency of inconsistent theories in Direct Logic (see below in this paper).
8 "The world that appears to our senses is in some way defective and filled with error, but there is a more real and perfect realm, populated by entities [called "ideals" or "forms"] that are eternal, changeless, and in some sense paradigmatic for the structure and character of our world. Among the most important of these [ideals] (as they are now called, because they are not located in space or time) are Goodness, Beauty, Equality, Bigness, Likeness, Unity, Being, Sameness, Difference, Change, and Changelessness. (These terms — "Goodness", "Beauty", and so on — are often capitalized by those who write about Plato, in order to call attention to their exalted status;…) The most fundamental distinction in Plato's philosophy is between the many observable objects that appear beautiful (good, just, unified, equal, big) and the one object that is what Beauty (Goodness, Justice, Unity) really is, from which those many beautiful (good, just, unified, equal, big) things receive their names and their corresponding characteristics. Nearly every major work of Plato is, in some way, devoted to or dependent on this distinction. Many of them explore the ethical and practical consequences of conceiving of reality in this bifurcated way. We are urged to transform our values by taking to heart the greater reality of the [ideals] and the defectiveness of the corporeal world." [Kraut 2004]
9 Perfection has traditionally been sought in the realm of the spiritual. However, Ernest Kurtz and Katherine Ketcham [1993] expounded on the thesis of the "spirituality of imperfection" building on the experience and insights of Hebrew prophets, Greek thinkers, Buddhist sages, Christian disciples and Alcoholics Anonymous. This is spirituality for the "imperfect because it is real and because imperfect has the possibility to be real." As Leonard Cohen said "There is a crack in everything: that's how the light gets in." The conception that they present is very far from the Platonic Ideals of being perfect, unchanging, and eternal.

Paraconsistency has been around for a while. So what's new?

Within mathematics paraconsistent10 logic was developed to deal with inconsistent theories. The idea of paraconsistent logic is to be able to make inferences from inconsistent information without being able to derive all propositions, a property called "simple paraconsistency" in this paper in contrast to "strong paraconsistency", which is discussed below.

10 Name coined by Francisco Miró Quesada in 1976 [Priest 2002, pg. 288].

The most extreme form of simple paraconsistent mathematics is dialetheism [Priest and Routley 1989], which maintains that there are true inconsistencies in mathematics itself, e.g., the Liar Paradox. However, mathematicians (starting with Euclid) have worked very hard to make their theories consistent, and inconsistencies have not been an issue for most working mathematicians. As a result:

• Since inconsistency was not an issue, mathematical logic focused on the issue of truth and a model theory of truth was developed [Dedekind 1888, Löwenheim 1915, Skolem 1920, Gödel 1930, Tarski and Vaught 1957, Hodges 2006]. More recently there has been work on the development of an unstratified logic of truth [Leitgeb 2007, Feferman 2007a].11

• Simple paraconsistent logic somewhat languished for lack of subject matter. The lack of subject matter resulted in simple paraconsistent proof theories that were for the most part so awkward as to be unused for mathematical practice.

Consequently mainstream logicians and mathematicians have tended to shy away from simple paraconsistency.

One of the achievements of Direct Logic is the development of an unstratified reflective strongly paraconsistent12 inference system with mathematical induction that does minimal damage to traditional natural deductive logical reasoning.

Previous simple paraconsistent logics have not been satisfactory for the purposes of Software Engineering because of their many seemingly arbitrary variants and their idiosyncratic inference rules and notation. For example (according to Priest [2006]), most simple paraconsistent and relevance logics rule out Disjunctive Syllogism ((Φ∨Ψ), ¬Φ ├ Ψ).13 However, Disjunctive Syllogism seems entirely natural for use in Software Engineering!

11 Of course, truth is out the window as a semantic foundation for the inconsistent theories of large software systems!
12 The basic idea of Strong Paraconsistency is that no nontrivial inferences should be possible from the mere fact of an inconsistency. By the principle of simple paraconsistency, in the empty theory ⊥ (that has no axioms beyond those of Direct Logic), there is a proposition Q such that
  P, ¬P ⊬⊥ Q
However, for the purposes of reasoning about large software systems, a stronger principle is needed. The principle of strong paraconsistency is stronger than simple paraconsistency in that it requires
  P, ¬P, Q ⊬⊥ ¬Q
because the inconsistency between P and ¬P is not relevant to Q. Of course, the following trivial inference is possible even with strong paraconsistency:
  P, ¬P ├⊥ (Q ├⊥ P) and so forth

Direct Logic

The proof of the pudding is the eating.
Cervantes [1605] in Don Quixote. Part 2. Chap. 24

Direct Logic14 is an unstratified strongly paraconsistent reflective formalism for using inference for large software systems with the following goals:

• Provide a foundation for strongly paraconsistent theories in Software Engineering.
• Formalize a notion of "direct" inference for strongly paraconsistent theories.
• Support all "natural" deductive inference [Fitch 1952; Gentzen 1935] in strongly paraconsistent theories with the exception of general Proof by Contradiction and Disjunction Introduction.15
• Support all the usual Boolean equivalences among propositions with the exception of absorption, which must be restricted to one direction to avoid IGOR.
• Support mutual reflection among code, documentation, and use cases of large software systems.
• Provide increased safety in reasoning about large software systems using strongly paraconsistent theories.

Direct Logic supports inference for a strongly paraconsistent reflective theory T (├T).16 Consequently, ├T does not support either general indirect inference (proof by contradiction) or disjunction introduction. However, ├T does support all other rules of natural deduction [Fitch 1952].17 Consequently, Direct Logic is well suited for practical reasoning about large software systems.18

13 Indeed according to Routley [1979] "The abandonment of disjunctive syllogism is indeed the characteristic feature of the relevant logic solution to the implicational paradoxes."
14 Direct Logic is distinct from the Direct Predicate Calculus [Ketonen and Weyhrauch 1984].
15 In this respect, Direct Logic differs from Quasi-Classical Logic [Besnard and Hunter 1995] for applications in information systems, which does include Disjunction Introduction.
16 Direct Logic also supports ├ which is a generalization of classical mathematical logic and consequently supports general indirect inference (proof by contradiction) as well as disjunction introduction. Although the semi-classical fragment of Direct Logic (├) is presumably consistent, because the Gödelian paradoxical sentence is self-provable in every paraconsistent reflective theory T, ├T is necessarily inconsistent. See discussion below.

The theories of Direct Logic are "open" in the sense of open-ended schematic axiomatic systems [Feferman 2007b]. The language of a theory can include any vocabulary in which its axioms may be applied, i.e., it is not restricted to a specific vocabulary fixed in advance (or at any other time). Indeed a theory can be an open system that can receive new information at any time [Hewitt 1991, Cellucci 1992].

Direct Logic is based on argument rather than truth

Finally there are simple ideas of which no definition can be given; there are also axioms or postulates, or in

a word primary principles, which cannot be proved and have no need of proof. Gottfried Leibniz

Partly in reaction to Popper19, Lakatos [1967, §2]) calls the

view below Euclidean (although there is, of course, no

claim concerning Euclid‟s own orientation):

“Classical epistemology has for two thousand years

modeled its ideal of a theory, whether scientific or

mathematical, on its conception of Euclidean

geometry. The ideal theory is a deductive system with

an indubitable truth-injection at the top (a finite

conjunction of axioms)—so that truth, flowing down

from the top through the safe truth-preserving

channels of valid inferences, inundates the whole system.”

Since truth is out the window for inconsistent theories, we

have the following reformulation:

Inference in a theory T (├T

) carries argument

from antecedents to consequents in chains of

inference.

17 But with the modification that Ψ ├T Φ does not automatically mean that ├T (Ψ⇨Φ). See discussion below.
18 In this respect, Direct Logic differs from previous paraconsistent logics, which had inference rules that made them intractable for use with large software systems.
19 Indirect inference has played an important role in science (emphasized by Karl Popper [1962]) as formulated in his principle of refutation, which in its most stark form is as follows: if ├T ¬Ob for some observation Ob, then it can be concluded that T is refuted (in a theory called Popper), i.e., ├Popper ¬T.

Each of the fundamental principles20 of Direct Logic below holds in every theory, both the semi-classical theory (├21) and every strongly paraconsistent theory. The only exceptions are as follows:

1. The following hold only for ├:22
   Full indirect inference: (Ψ ├ Φ, ¬Φ) ⇨ (├ ¬Ψ)
   Disjunction introduction: Ψ ├ (Ψ∨Φ)
2. Reification reflection23 does not hold for ├.

Syntax of Direct Logic

Direct Logic has the following syntax:

• If Φ and Ψ are propositions, then ¬Φ (negation), Φ∧Ψ (conjunction), Φ∨Ψ (disjunction), Φ⇨Ψ (implication), and Φ⇔Ψ (bi-implication) are propositions.
• Atomic names are expressions.24 Also numbers are expressions.
• If x1, …, and xn are variables and Φ is a proposition, then the following is a proposition that says "for all x1, …, and xn: Φ holds":
    x1; …; xn: Φ
• If F is an expression and E1, …, En are expressions, then F(E1, …, En) is an expression.
• If X1, …, Xn are identifiers and E is an expression, then (λ(X1, …, Xn) E) is an expression.
• If E1, E2, and E3 are expressions, then the following are expressions:
    if E1 then E2 else E3
    E1 == E2 (E1 and E2 are the same Actor)
• If E1, …, En are expressions, then [E1, …, En] (the sequence of E1, …, and En) is an expression.
• If E1 and E2 are expressions, then [E1 E2] (the sequence of E1 followed by the elements of the sequence E2) is an expression.
• If X is a variable, E is an expression, and Φ is a proposition, then {X∈E | Φ} (the set of all X in E such that Φ) is an expression.
• If E1 and E2 are expressions, then E1=E2, E1∈E2, and E1⊆E2 are propositions.
• If P is an expression and E1, …, En are expressions, then P[E1, …, En] is a proposition.
• If E1 and E2 are expressions, then "E1 can reduce to E2 in the nondeterministic λ-calculus" is a proposition.
• If E is an expression, then "E always converges in the nondeterministic λ-calculus" is a proposition.
• If E is an expression, then "E is irreducible in the nondeterministic λ-calculus" is a proposition.
• If E1 and E2 are expressions, then "E1 can converge to E2 in the nondeterministic λ-calculus" is a proposition.
• If E is an expression, then "E reduces to exactly one expression in the nondeterministic λ-calculus" is a proposition.
• If T is an expression and Φ is a proposition, then ├T Φ (Φ is provable in T) is a proposition.
• If T is an expression and Φ1, …, Φk and Ψ1, …, Ψn are propositions, then Φ1, …, Φk ├T Ψ1, …, Ψn is a proposition that says Φ1, …, and Φk infer Ψ1, …, and Ψn in T.
• If T is an expression, E is an expression, and Φ is a proposition, then E ╟T Φ (E is a proof of Φ in T) is a proposition.
• If s is a sentence (in XML25), then ⌈s⌉ (the abstraction of s) is a proposition. If p is a phrase (in XML), then ⌈p⌉ (the abstraction of p) is an expression.26
• If Φ is a proposition, then ⌊Φ⌋ (the reification of Φ) is a sentence (in XML). If E is an expression, then ⌊E⌋ (the reification of E) is a phrase (in XML).

In general, the theories of Direct Logic are inconsistent and therefore propositions cannot be consistently labeled with truth values. Consequently, Direct Logic differentiates expressions (which do have values) from propositions (which do not have values).

Note that Direct Logic does not have quantifiers, but universally quantified variables are allowed at the top level in statements.27

20 The fundamental principles of Direct Logic are placed in boxes like this one and they are not independent.
21 It is important not to confuse the classical theory ├ with the empty paraconsistent theory ├⊥ that has no axioms beyond those of Direct Logic. The theory ├ is presumably consistent whereas the theory ├⊥ is inconsistent (as shown later in this paper).
22 Consequently, the classical deduction theorem holds: (├ (Ψ⇨Φ)) ⇔ (Ψ ├ Φ).
23 Defined and discussed later in this paper.
24 For example, Fred and x are atomic names. An atomic name is either a constant, variable, or identifier. Variables are universally quantified and identifiers are bound in λ-expressions. As a convention in this paper, the first letter of a constant is capitalized.
25 Computer science has standardized on XML for the (textual) representation of tree structures.
26 For example, ⌈λ(x) x=0⌉ is an expression. In this respect Direct Logic differs from Lambda Logic [Beeson 2004], which does not have abstraction and reification.
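The syntax above is essentially an abstract grammar over expressions and propositions. As a purely illustrative sketch (the Python class names below are hypothetical, not part of Direct Logic), part of it could be represented as data, covering negation, conjunction, λ-expressions, application, and the provability proposition ├T Φ:

    from dataclasses import dataclass
    from typing import Tuple

    # --- Expressions ---
    @dataclass(frozen=True)
    class Name:                 # atomic name (constant, variable, or identifier)
        text: str

    @dataclass(frozen=True)
    class Lam:                  # (λ(X1, ..., Xn) E)
        params: Tuple[str, ...]
        body: object

    @dataclass(frozen=True)
    class App:                  # F(E1, ..., En), also used here for P[E1, ..., En]
        fn: object
        args: Tuple[object, ...]

    # --- Propositions ---
    @dataclass(frozen=True)
    class Not:
        arg: object

    @dataclass(frozen=True)
    class And:
        left: object
        right: object

    @dataclass(frozen=True)
    class Provable:             # ├T Φ : "Φ is provable in theory T"
        theory: object
        prop: object

    # Example: the proposition  ├Catch-22 ¬Fly[Yossarian]
    example = Provable(Name("Catch-22"), Not(App(Name("Fly"), (Name("Yossarian"),))))
    print(example)

Because provability (├T Φ) is itself a proposition of the syntax, such a representation can nest judgments inside judgments, which is exactly what unstratified reflection requires.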

Soundness, Faithfulness, and Adequacy

Soundness in Direct Logic is the principle that the rules of Direct Logic preserve arguments, i.e.,

  Soundness: (Ψ ├T Φ) ├T ((├T Ψ) ├T (├T Φ))
  If an inference holds and furthermore the antecedent of the inference is a theorem, then the consequent of the inference is a theorem.

Adequacy is the property that if an inference holds, then the theory in which the inference holds is adequate to prove the proposition that the inference holds, i.e.,

  Adequacy: (Ψ ├T Φ) ├T (├T (Ψ ├T Φ))
  If an inference holds, then it is provable that it holds.

Faithfulness is the property that if a theory proves the proposition that an inference holds, then the theory faithfully proves the inference, i.e.,

  Faithfulness: (├T (Ψ ├T Φ)) ├T (Ψ ├T Φ)
  If the proposition that an inference holds is provable, then the inference holds.

27 Consider the following statement S:
  p,q∈Humans: Mortal[ACommonAncestor(p, q)]
where the syntax has been extended in the obvious way to allow constraints on variables. An instantiation of S can be specified by supplying values for variables. For example S[Socrates, Plato] is the proposition
  Socrates,Plato∈Humans ⇨ Mortal[ACommonAncestor(Socrates, Plato)]
Note that care must be taken in forming the negation of statements. Direct Logic directly incorporates Skolemization, unlike Lambda Logic [Beeson 2004], classical first-order set theory, etc. For example the negation of S is the proposition
  ¬(PS,QS∈Humans ⇨ Mortal[ACommonAncestor(PS, QS)])
where PS and QS are Skolem constants. See the axiomatization of set theory in the first appendix for further examples of the use of Skolem functions in Direct Logic (see Appendix 1).

Direct Logic has the following housekeeping rules:28

  Reiteration: Φ ├T Φ
  A proposition infers itself.
  Exchange: Φ, Ψ ├T Ψ, Φ
  The order in which propositions are written does not matter.
  Residuation: (Φ, Ψ ├T Θ) ┤├T (Φ ├T (Ψ ├T Θ))
  Hypotheses may be freely introduced and discharged.
  Monotonicity: (Φ ├T Ψ) ├T (Φ, Θ ├T Ψ)
  An inference remains if new information is added.
  Dropping: (Φ ├T Ψ, Θ) ├T (Φ ├T Ψ)
  An inference remains if extra conclusions are dropped.
  Independent inference: ((Φ ├T Ψ), (Φ ├T Θ)) ┤├T (Φ ├T Ψ, Θ)
  Inferences can be combined.
  Transitivity: ((Φ ├T Ψ), (Ψ ├T Θ)) ├T (Φ ├T Θ)
  Inference is transitive.
  Variable Elimination: (x: P[x]) ├T P[E]
  A universally quantified variable of a statement can be instantiated with any expression E (taking care that none of the variables in E are captured).
  Variable Introduction: Let Z be a new constant. (├T P[Z]) ├T (├T x: P[x])
  Proving a statement with a universally quantified variable is equivalent to proving the statement with a newly introduced constant substituted for the variable.

28 Nontriviality principles have also been proposed as extensions to Direct Logic, including the following:
  Direct Nontriviality: ¬Φ ├T ¬(├T Φ)
  The negation of a proposition infers that it cannot be proved.
  Meta Nontriviality: (├T ¬Φ) ├T ¬(├T Φ)
  The provability of the negation of a proposition infers that the proposition cannot be proved.

Self-refutation (Direct Indirect Inference)

"Contrariwise," continued Tweedledee, "if it was so, it might be; and if it were so, it would be; but as it isn't, it ain't. That's logic." Carroll [1871]

Direct Logic supports direct versions of indirect inference for strongly paraconsistent theories as follows:29

  Simple Self-refutation: (Φ ├T ¬Φ) ├T (├T ¬Φ)
  A proposition can be disproved by showing that the proposition infers its own negation.
  Right Meta Self-refutation: (Φ ├T (├T ¬Φ)) ├T (├T ¬Φ)
  A proposition can be disproved by showing that the proposition infers a proof of its own negation.
  Left Meta Self-refutation: ((├T Φ) ├T ¬Φ) ├T (├T ¬(├T Φ))
  Provability of a proposition can be disproved by showing that its provability infers its own negation.
  Both Meta Self-refutation: ((├T Φ) ├T (├T ¬Φ)) ├T (├T ¬(├T Φ))
  Provability of a proposition can be disproved by showing that its provability infers provability of its negation.

Self-refutation (Direct Indirect Proof) can sometimes do inferences that are traditionally done using Full Indirect Inference. For example the proof of the incompleteness of theories in this paper makes use of Self-refutation.

29 Direct Logic does not support either the Principle of Full Indirect Inference (Ψ ├T Φ, ¬Φ) ├T ¬Ψ or the Principle of Disjunction Introduction Ψ ├T (Ψ∨Φ).

Booleans

The Booleans30 in Direct Logic are as close to classical logic as possible.

30 ¬ (negation), ∧ (conjunction), ∨ (disjunction), and ⇨ (implication).

Negation

We could not even think "being" without a double negation: being must be thought as the negation of negation of being. Paul Tillich [2000]

The following is a fundamental principle of Direct Logic:

  Double Negation Elimination: ¬¬Φ ┤├T Φ 31

Other fundamental principles for negation are found in the next sections.

31 Φ ┤├T Ψ means Φ ├T Ψ and Ψ ├T Φ.

Conjunction and Disjunction

Direct Logic tries to be as close to classical logic as possible in making use of natural inference, e.g., "natural deduction". Consequently, we have the following equivalences for juxtaposition (comma):

  Conjunction in terms of Juxtaposition (comma):
  (Φ, Ψ ├T Θ) ┤├T ((Φ∧Ψ) ├T Θ)
  (Φ ├T Ψ, Θ) ┤├T (Φ ├T (Ψ∧Θ))

Direct Logic defines disjunction (∨) in terms of conjunction and negation in a fairly natural way as follows:

  Disjunction in terms of Conjunction and Negation:
  Φ∨Ψ ┤├T ¬(¬Φ∧¬Ψ)

Since Direct Logic aims to preserve standard Boolean properties, we have the following principle:

  Conjunction infers Disjunction:
  (Φ∧Ψ) ├T (Φ∨Ψ)
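To make the housekeeping rules above concrete, here is a small sketch (the Python names are hypothetical and the encoding is an assumption, not Direct Logic's own formalism) that treats a judgment Φ1, …, Φk ├T Ψ as data and implements Monotonicity and Transitivity as functions from judgments to judgments:

    from dataclasses import dataclass
    from typing import FrozenSet

    @dataclass(frozen=True)
    class Judgment:
        """Represents  antecedents ├T consequent  for a theory T."""
        theory: str
        antecedents: FrozenSet[str]
        consequent: str

    def monotonicity(j: Judgment, new_info: str) -> Judgment:
        """(Φ ├T Ψ) ├T (Φ, Θ ├T Ψ): adding information preserves the inference."""
        return Judgment(j.theory, j.antecedents | {new_info}, j.consequent)

    def transitivity(j1: Judgment, j2: Judgment) -> Judgment:
        """((Φ ├T Ψ), (Ψ ├T Θ)) ├T (Φ ├T Θ): chain two inferences."""
        assert j1.theory == j2.theory and j2.antecedents == frozenset({j1.consequent})
        return Judgment(j1.theory, j1.antecedents, j2.consequent)

    j1 = Judgment("T", frozenset({"Phi"}), "Psi")
    j2 = Judgment("T", frozenset({"Psi"}), "Theta")
    print(transitivity(j1, j2))            # Phi ├T Theta
    print(monotonicity(j1, "Extra"))       # Phi, Extra ├T Psi

The point of the sketch is only that these rules operate on whole judgments, without ever appealing to truth values, which is what allows them to survive in inconsistent theories.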

Implication

Lakatos characterizes his own view as quasi-empirical:

“Whether a deductive system is Euclidean or quasi-

empirical is decided by the pattern of truth value flow

in the system. The system is Euclidean if the

characteristic flow is the transmission of truth from

the set of axioms „downwards‟ to the rest of the

system—logic here is an organon of proof; it is

quasi-empirical if the characteristic flow is

retransmission of falsity from the false basic

statements „upwards‟ towards the „hypothesis‟—logic

here is an organon of criticism.”

Direct Logic defines implication (⇨) in terms of conjunction and negation in a fairly natural way as follows:

  Implication in terms of Conjunction and Negation:
  Φ⇨Ψ ┤├T ¬(Φ∧¬Ψ)

Two-way Deduction Theorem

In classical logic there is a strong connection between deduction and implication through the Classical Deduction Theorem:

  (├ (Ψ⇨Φ)) ┤├ (Ψ ├ Φ)

However, the classical deduction theorem does not hold in general for paraconsistent theories of Direct Logic.32 Instead, Direct Logic has a Two-way Deduction Theorem that is explained below.

  Lemma: (├T (Φ⇨Ψ)) ├T ((Φ ├T Ψ) ∧ (¬Ψ ├T ¬Φ))
  Proof: Suppose ├T (Φ⇨Ψ). Therefore ├T ¬(Φ∧¬Ψ), i.e., ├T (¬Φ∨Ψ). By Disjunctive Syllogism, it follows that Φ ├T Ψ and ¬Ψ ├T ¬Φ.

What about the converse of the above theorem?

  Lemma: ((Φ ├T Ψ) ∧ (¬Ψ ├T ¬Φ)) ├T (├T (Φ⇨Ψ))
  Proof: Suppose Φ ├T Ψ and ¬Ψ ├T ¬Φ. By the principle of Simple Self-refutation, to prove ├T (Φ⇨Ψ), i.e., ├T ¬(Φ∧¬Ψ), it is sufficient to prove the following:
    (Φ∧¬Ψ) ├T ¬(Φ∧¬Ψ)
  Thus it is sufficient to prove (Φ∧¬Ψ) ├T (¬Φ∨Ψ). But (Φ∧¬Ψ) ├T (Ψ∧¬Φ) ├T (¬Φ∨Ψ) by the suppositions above and the principle that Conjunction infers Disjunction.

Putting the above two theorems together we have the Two-Way Deduction Theorem for Implication:

  (├T (Φ⇨Ψ)) ┤├T ((Φ ├T Ψ) ∧ (¬Ψ ├T ¬Φ))

Consequently:

  In Direct Logic, implication carries argument both ways between antecedents and consequents in chains of implication.

Thus, in Direct Logic, implication (⇨), rather than inference (├T), supports Lakatos quasi-empiricism.

The following corollaries follow:

  Two-Way Deduction Theorem for Disjunction:
  (├T (Φ∨Ψ)) ┤├T ((¬Φ ├T Ψ) ∧ (¬Ψ ├T Φ))

  Transitivity of Implication: (Φ⇨Ψ), (Ψ⇨Θ) ├T (Φ⇨Θ)
  Proof: Follows immediately from the Two-Way Deduction Theorem for Implication by chaining in both directions for ⇨.

  Reflexivity of Implication: ├T (Φ⇨Φ)

32 For example, in the empty strongly paraconsistent theory ⊥ (that has no axioms beyond those of Direct Logic), Q ├⊥ (P∨¬P) but ⊬⊥ (Q⇨(P∨¬P)).

  Proof: Follows immediately from Φ ├T Φ and ¬Φ ├T ¬Φ using the Two-Way Deduction Theorem.

Boolean equivalences

Theorem: The following usual Boolean equivalences hold:

  Double Negation: ¬¬Φ ⇔ Φ
  Idempotence of ∧: Φ∧Φ ⇔ Φ
  Commutativity of ∧: Φ∧Ψ ⇔ Ψ∧Φ
  Associativity of ∧: Φ∧(Ψ∧Θ) ⇔ (Φ∧Ψ)∧Θ
  Distributivity of ∧ over ∨: Φ∧(Ψ∨Θ) ⇔ (Φ∧Ψ)∨(Φ∧Θ)
  De Morgan for ∧: ¬(Φ∧Ψ) ⇔ ¬Φ∨¬Ψ
  Idempotence of ∨: Φ∨Φ ⇔ Φ
  Commutativity of ∨: Φ∨Ψ ⇔ Ψ∨Φ
  Associativity of ∨: Φ∨(Ψ∨Θ) ⇔ (Φ∨Ψ)∨Θ
  Distributivity of ∨ over ∧: Φ∨(Ψ∧Θ) ⇔ (Φ∨Ψ)∧(Φ∨Θ)
  De Morgan for ∨: ¬(Φ∨Ψ) ⇔ ¬Φ∧¬Ψ
  Implication as Disjunction: (Φ⇨Ψ) ⇔ ¬Φ∨Ψ
  Contrapositive: (Φ⇨Ψ) ⇔ ¬Ψ⇨¬Φ
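Since these are the classical equivalences, they can be checked mechanically over truth values. The following illustrative sketch does so for a few of them (this is only a classical sanity check; Direct Logic's own proof theory is not truth-functional, as noted above):

    from itertools import product

    def imp(a, b):
        # Implication as defined above:  Φ⇨Ψ  ≡  ¬(Φ∧¬Ψ)
        return not (a and (not b))

    # A few of the named equivalences from the theorem, as pairs of truth-functions.
    equivalences = {
        "Double Negation":               (lambda p, q, r: not (not p),       lambda p, q, r: p),
        "De Morgan for conjunction":     (lambda p, q, r: not (p and q),     lambda p, q, r: (not p) or (not q)),
        "De Morgan for disjunction":     (lambda p, q, r: not (p or q),      lambda p, q, r: (not p) and (not q)),
        "Distributivity of and over or": (lambda p, q, r: p and (q or r),    lambda p, q, r: (p and q) or (p and r)),
        "Implication as Disjunction":    (lambda p, q, r: imp(p, q),         lambda p, q, r: (not p) or q),
        "Contrapositive":                (lambda p, q, r: imp(p, q),         lambda p, q, r: imp(not q, not p)),
    }

    for name, (lhs, rhs) in equivalences.items():
        ok = all(lhs(p, q, r) == rhs(p, q, r)
                 for p, q, r in product([False, True], repeat=3))
        print(f"{name}: {'holds' if ok else 'fails'} on all classical truth assignments")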

Disjunction Introduction by Negation

The principle of Disjunction Introduction by Negation33 is that a disjunction always holds for a proposition and its negation.34 It can be expressed as follows:

  Theorem. Disjunction Introduction by Negation: ├T (Φ∨¬Φ)
  Proof: Follows immediately from Reflexivity of Implication, the definition of implication, De Morgan, and Double Negation Elimination.

33 Often called "Excluded Middle" in classical logic.
34 da Costa [1963] proposed the operator ○Φ ≡ ¬(Φ∧¬Φ) together with the following principle that characterizes Logic for Formal Inconsistency:
  ○Φ, Φ, ¬Φ ├T Ψ
However, in Direct Logic ○Φ is provable for every proposition Φ, and so the above principle would result in IGOR (see above).

Additional principles

Since Direct Logic aims to support as much of classical reasoning as possible, we have the following principles:

  Absorption of ∧: Φ∧(Φ∨Ψ) ├T Φ
  Absorption of ∨: Φ∨(Φ∧Ψ) ├T Φ
  Disjunctive Splitting by Cases: (Φ∨Ψ), (Φ ├T Θ), (Ψ ├T Θ) ├T Θ

Direct Logic uses strong paraconsistency to facilitate theory development

Strongly paraconsistent theories can be easier to develop than classical theories because perfect absence of inconsistency is not required. In case of inconsistency, there will be some propositions that can be both proved and disproved, i.e., there will be arguments both for and against the propositions.

A classic case of inconsistency occurs in the novel Catch-22 [Heller 1995], which states that a person "would be crazy to fly more missions and sane if he didn't, but if he was sane he had to fly them. If he flew them he was crazy and didn't have to; but if he didn't want to he was sane and had to. Yossarian was moved very deeply by the absolute simplicity of this clause of Catch-22 and let out a respectful whistle. 'That's some catch, that Catch-22,' he observed."

So in the spirit of Catch-22, consider the following axiomatization of the above:

1. p: AbleToFly[p], ¬Fly[p] ├Catch-22 Sane[p]  (axiom)
2. p: Sane[p] ├Catch-22 ObligatedToFly[p]  (axiom)
3. p: Sane[p], ObligatedToFly[p] ├Catch-22 Fly[p]  (axiom)
4. ├Catch-22 AbleToFly[Yossarian]  (axiom)
5. ¬Fly[Yossarian] ├Catch-22 Fly[Yossarian]  (from 1 through 4)
6. ├Catch-22 Fly[Yossarian]  (from 5 using Simple Self-refutation)
7. p: Fly[p] ├Catch-22 Crazy[p]  (axiom)
8. p: Crazy[p] ├Catch-22 ¬ObligatedToFly[p]  (axiom)
9. p: Sane[p], ¬ObligatedToFly[p] ├Catch-22 ¬Fly[p]  (axiom)
10. ├Catch-22 Sane[Yossarian]  (axiom)
11. ├Catch-22 ¬Fly[Yossarian]  (from 6 through 10)

Thus there is an inconsistency in the above theory Catch-22 in that:

6. ├Catch-22 Fly[Yossarian]
11. ├Catch-22 ¬Fly[Yossarian]

Various objections can be made against the above axiomatization of the theory Catch-22.35 However, Catch-22 illustrates several important points:

• Even a very simple microtheory can engender inconsistency.

• Strong paraconsistency facilitates theory development because a single inconsistency is not disastrous.

• Direct Logic supports fine grained reasoning because inference does not necessarily carry argument in the contrapositive direction. For example, the general principle "A person who flies is crazy." (i.e., Fly[p] ├Catch-22 Crazy[p]) does not support the inference of ¬Fly[Yossarian] from ¬Crazy[Yossarian]. E.g., it might be the case that Fly[Yossarian] even though it infers Crazy[Yossarian], contradicting ¬Crazy[Yossarian].

• Even though the theory Catch-22 is inconsistent, it is not meaningless.
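The derivation above can also be replayed mechanically. The following is a minimal forward-chaining sketch in Python (the rule encoding and names are illustrative assumptions, not the paper's formalism) that derives both Fly[Yossarian] and ¬Fly[Yossarian] from the Catch-22 axioms:

    # Facts are strings; a rule fires when all its premises are among the derived facts.
    # Negation is represented syntactically with a leading "not ".
    rules = [
        ({"AbleToFly[Y]", "not Fly[Y]"}, "Sane[Y]"),          # axiom 1 (instantiated)
        ({"Sane[Y]"}, "ObligatedToFly[Y]"),                    # axiom 2
        ({"Sane[Y]", "ObligatedToFly[Y]"}, "Fly[Y]"),          # axiom 3
        ({"Fly[Y]"}, "Crazy[Y]"),                              # axiom 7
        ({"Crazy[Y]"}, "not ObligatedToFly[Y]"),               # axiom 8
        ({"Sane[Y]", "not ObligatedToFly[Y]"}, "not Fly[Y]"),  # axiom 9
    ]

    def closure(facts):
        """Forward-chain until no rule adds anything new."""
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            for premises, conclusion in rules:
                if premises <= derived and conclusion not in derived:
                    derived.add(conclusion)
                    changed = True
        return derived

    # Step 5 of the text: assuming "not Fly[Y]" leads to "Fly[Y]" (so, by Simple
    # Self-refutation, the theory asserts Fly[Y]); chaining from Fly[Y] then also
    # yields "not Fly[Y]", reproducing the inconsistency of steps 6 and 11.
    hypothetical = closure({"AbleToFly[Y]", "Sane[Y]", "not Fly[Y]"})
    print("Fly[Y]" in hypothetical)                    # True: not Fly[Y] infers Fly[Y]
    both = closure({"AbleToFly[Y]", "Sane[Y]", "Fly[Y]"})
    print("Fly[Y]" in both, "not Fly[Y]" in both)      # True True: the inconsistency

Note that the forward chainer happily carries both Fly[Y] and "not Fly[Y]" without deriving every other fact, which is the behavior a strongly paraconsistent theory is meant to legitimize.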

Unstratified Reflection is the Norm

Reflection and self-reference are central to Software Engineering. Reflection in logic is treated in the sections below, whereas reflection in concurrent programming is treated in Hewitt [2009c].

Abstraction and Reification

Direct Logic distinguishes between concrete sentences in XML and abstract propositions.36 Software Engineering requires that it must be easy to construct abstract propositions from concrete sentences.37 Direct Logic provides abstraction for this purpose as follows:

  Every sentence s in XML has an abstraction38 ⌈s⌉ that is the proposition given by s.39

Abstraction can be used to formally self-express important properties of Direct Logic such as the following:

  The principle Theorems have Proofs says that a proposition is a theorem of a strongly paraconsistent theory T if and only if it has an argument that proves it in T, i.e.,
    s∈Sentences: (├T ⌈s⌉) ⇔ (AproofT(s) ╟T ⌈s⌉)
  where AproofT is a choice function that chooses a proof of ⌈s⌉.

  Furthermore, there is a linear recursive40 ProofCheckerT such that:
    p∈Proofs, s∈Sentences: ProofCheckerT(p, s)=1 ⇔ p ╟T ⌈s⌉

Conversely, every proposition Φ has a reification41 (given by ⌊Φ⌋42) that is a sentence in XML.43

The sections below address issues concerning the relationship between abstraction and reification.

The use cases, documentation, and code are becoming increasingly mutually reflective in that they refer to and make use of each other. E.g.,

• The execution of code can be dynamically checked against its documentation. Also Web Services can be dynamically searched for and invoked on the basis of their documentation.

• Use cases can be inferred by specialization of documentation and from code by automatic test generators and by model checking.

• Code can be generated by inference from documentation and by generalization from use cases.

Abstraction and reification are needed for large software systems so that documentation, use cases, and code can mutually speak about what has been said and its meaning.

35 Both Crazy[Yossarian] and Sane[Yossarian] can be inferred from the axiomatization, but this per se is not inconsistent.
36 This is reminiscent of the Platonic divide (but without the moralizing). Gödel thought that "Classes and concepts may, however, also be conceived as real objects…existing independently of our definitions and constructions." [Gödel 1944 pg 456]
37 Analogous to the requirement that it must be easy to construct executable code from concrete programs (text).
38 For example, if s and t are sentences in XML, then ⌈<and> s t </and>⌉ ⇔ (⌈s⌉ ∧ ⌈t⌉). Cf. Sieg and Field [2005] on abstraction.
39 Heuristic: think of the "elevator bars" ⌈ ⌉ around s as "raising" the concrete sentence s "up" into the abstract proposition ⌈s⌉. The elevator bar heuristics are due to Fanya S. Montalvo.
40 I.e., executes in time proportional to the size of its input.
41 Reifications are in some ways analogous to Gödel numbers [Gödel 1931].
42 Heuristic: think of the "elevator bars" ⌊ ⌋ around Φ as "lowering" the abstract proposition Φ "down" into the concrete sentence that is its reification in XML. The reifications of propositions can be quite complex because of various optimizations that are used in the implementations of propositions.
43 Note that, if s is a sentence, then in general ⌊⌈s⌉⌋ need not be s. However, s,t∈Sentences: s=t ⇨ (⌈s⌉ ⇔ ⌈t⌉).
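The ProofCheckerT above is only required to run in time linear in the size of its input. As a purely illustrative sketch (a toy Hilbert-style checker, not Direct Logic's actual proof format), here is a checker that validates a proof presented as a list of steps, each citing an axiom or modus ponens on earlier steps:

    def proof_checker(proof, axioms):
        """Return 1 if every step is an axiom or follows by modus ponens
        from earlier steps; 0 otherwise. One pass over the proof, so the
        running time is linear in its length (with constant-time lookups)."""
        derived = set()
        for step in proof:
            kind = step[0]
            if kind == "axiom":
                _, prop = step
                if prop not in axioms:
                    return 0
            elif kind == "mp":  # modus ponens: from A and ("implies", A, B) infer B
                _, a, prop = step
                if a not in derived or ("implies", a, prop) not in derived:
                    return 0
            else:
                return 0
            derived.add(prop)
        return 1

    axioms = {"P", ("implies", "P", "Q")}
    proof = [("axiom", "P"),
             ("axiom", ("implies", "P", "Q")),
             ("mp", "P", "Q")]
    print(proof_checker(proof, axioms))  # 1: the proof checks, so Q is proved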


However, using abstraction and reification can result in

paradoxes as a result of the Diagonal Argument (explained

below).

Diagonal Argument

The Diagonal Argument has been used to prove many

famous theorems beginning with the proof that the real

numbers are not countable [Cantor 1890, Zermelo 1908].

Proof. Suppose to the contrary that the function f:ℕ→ℝ enumerates the real numbers that are greater than or equal to 0 but less than 1, so that f(n)ᵢ is the ith binary digit in the binary expansion of f(n), which can be diagrammed as an array with infinitely many rows and columns of binary digits as follows:

.f(1)₁ f(1)₂ f(1)₃ … f(1)ᵢ …
.f(2)₁ f(2)₂ f(2)₃ … f(2)ᵢ …
.f(3)₁ f(3)₂ f(3)₃ … f(3)ᵢ …
        ⋮
.f(i)₁ f(i)₂ f(i)₃ … f(i)ᵢ …
        ⋮

Define Diagonal as follows:

Diagonal ≡ Diagonalize(f)
where Diagonalize(g) ≡44 λ(i) ¬g(i)ᵢ
where ¬g(i)ᵢ denotes the complement of the binary digit g(i)ᵢ

Diagonal can be diagrammed as the sequence of complemented digits running down the diagonal of the array:

.¬f(1)₁ ¬f(2)₂ ¬f(3)₃ … ¬f(i)ᵢ …

Therefore Diagonal is a real number not enumerated by f because it differs in the ith digit of every f(i).

The Diagonal Argument is used in conjunction with the

Logical Fixed Point theorem that is described in the next

section.
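A small Python sketch of the diagonal construction may help; the enumeration f used here is hypothetical, and reals are represented simply as functions from digit positions to binary digits.

# Illustrative sketch of the diagonal construction. Each "real" is represented
# by a function giving its i-th binary digit; diagonalize(f) complements the
# i-th digit of f(i), so the resulting digit stream differs from every f(n).

def diagonalize(f):
    return lambda i: 1 - f(i)(i)          # complement the i-th digit of f(i)

# A hypothetical enumeration f: the n-th "real" has digit (n + i) % 2 at position i.
f = lambda n: (lambda i: (n + i) % 2)

diagonal = diagonalize(f)
# The diagonal differs from f(n) at position n for every n we care to test:
for n in range(10):
    assert diagonal(n) != f(n)(n)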

Logical Fixed Point Theorem

The Logical Fixed Point Theorem enables propositions to

effectively speak of themselves .

In this paper, the fixed point theorem is used to

demonstrate the existence of self-referential sentences that

will be used to prove theorems about Direct Logic using

the Diagonal Argument.

Theorem [a λ-calculus version of Carnap 1934 pg 91 after

Gödel 1931]45:

44 The symbol “≡” is used for “is defined as”.

Let f be a total function from Sentences to Sentences46:

├T (⌈Fix(f)⌉ ⇔ ⌈f(Fix(f))⌉)

where Fix(f) ≡ Θ(Θ), which exists because f always converges,
where Θ ≡ λ(g) f(λ(x) (g(g))(x))47

Proof:
Fix(f) = Θ(Θ)
       = (λ(g) f(λ(x) (g(g))(x)))(Θ)
       = f(λ(x) (Θ(Θ))(x))
       = f(Θ(Θ))                          by functional abstraction on Θ(Θ)
       = f(Fix(f))
⌈Fix(f)⌉ ⇔ ⌈f(Fix(f))⌉48                   by abstraction of equals
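The Θ construction can be transcribed almost literally into Python (a sketch only; the theorem's f maps Sentences to Sentences, whereas here the same trick is applied to ordinary functions). The inner λ(x) (g(g))(x) is what keeps Θ(Θ) from diverging in a strict language, as footnote 47 explains.

# A sketch of the fixed point construction Fix(f) ≡ Θ(Θ) with
# Θ ≡ λ(g) f(λ(x) (g(g))(x)), transcribed into Python. The inner
# lambda x: (g(g))(x) delays evaluation of g(g), which is what makes the
# construction usable in a strict language.

def fix(f):
    theta = lambda g: f(lambda x: (g(g))(x))
    return theta(theta)

# Using fix to obtain a recursive function as the fixed point of a functional:
fact = fix(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
assert fact(5) == 120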

Disadvantages of stratified metatheories

To avoid inconsistencies in mathematics (e.g., Liar

Paradox, Russell's Paradox, Curry's Paradox, etc.), some

restrictions are needed around self-reference. The question

is how to do it [Feferman 1984a, Restall 2006].49

The approach which is currently standard in

mathematics is the Tarskian framework of assuming that

there is a hierarchy of metatheories in which the semantics

of each theory is formalized in its metatheory [Tarski and

Vaught 1957].

According to Feferman [1984a]:

“…natural language abounds with directly or

indirectly self-referential yet apparently harmless

expressions—all of which are excluded from the

Tarskian framework.” Large software systems likewise abound with directly or

indirectly self-referential propositions in reasoning about

their use cases, documentation, and code that are excluded

by the Tarskian framework. Consequently the assumption

of hierarchical metatheories is not very suitable for

Software Engineering.

45 Credited in Kurt Gödel, Collected Works vol. I, p. 363, ftn. 23.

However, Carnap, Gödel and followers did not use the λ-calculus and consequently their formulation is more convoluted. 46 Note that f is an ordinary Lisp-like function except that Sentences (a subset of XML) are used instead of S-expressions. 47 Where did the definition of Θ come from? First note that λ(x) (g(g))(x) = g(g) and consequently Θ = λ(g) f(g(g)). So Θ takes itself as an argument and returns the result of applying f to the result of applying itself to itself! In this way a fixed point of f is constructed. 48 Note that equality (=) is not defined on abstract propositions (like ⌈Fix(f)⌉). Also note that logical equivalence (⇔) is not defined on concrete XML sentences (like Fix(f)). 49 According to [Priest 2004], "the whole point of the dialetheic

solution to the semantic paradoxes is to get rid of the distinction

between object language and meta-language".


But paradoxes loom: the Liar Paradox goes back at least

as far as the Greek philosopher Eubulides of Miletus who

lived in the fourth century BC. It could be put as follows:

LiarProposition is defined to be the proposition “The negation of LiarProposition holds.”

From its definition, LiarProposition holds if and only if it

doesn't!

The argument can be formalized using the fixed point

theorem and the diagonal argument in the following way:

LiarProposition ≡ ⌈Fix(Diagonalize)⌉
where Diagonalize ≡ λ(s) ⌊¬⌈s⌉⌋ 50

The Liar Paradox can be stated as follows:

LiarProposition ⇔ ¬LiarProposition

Argument for the Liar Paradox51:
LiarProposition ⇔ ⌈Fix(Diagonalize)⌉
⇔ ⌈Diagonalize(Fix(Diagonalize))⌉          by the fixed point theorem
⇔ ⌈(λ(s) ⌊¬⌈s⌉⌋)(Fix(Diagonalize))⌉
⇔ ¬⌈Fix(Diagonalize)⌉
⇔ ¬LiarProposition                          (the step above is not valid in Direct Logic)

In order not to be plagued by paradoxes such as the one above, Direct Logic adopts the approach of restricting the kinds of propositions that can be used in the last step of the above kinds of arguments, as discussed in the next section.

Reification Reflection

Direct Logic makes use of the following principle:

The Reification Reflection Principle for paraconsistent theories of Direct Logic52 is that if Φ is Admissible for T then:

├T (⌈⌊Φ⌋⌉ ⇔ Φ)

Of course, the above criterion begs the question of which propositions are Admissible in T! A proposed answer is provided by the following:

50 Note that Diagonalize always converges. 51 As explained below, this argument is not valid in Direct Logic. 52 Note that Reification Reflection does not apply to the semi-classical theory ├.

The Criterion of Admissibility for Direct Logic is53:

Φ is Admissible for T if and only if (¬Φ) ├T (├T ¬Φ)

I.e., the Criterion of Admissibility is that a proposition is Admissible for a theory T if and only if its negation infers in T that its negation is provable in T.54

Theorem. If Φ and Ψ are Admissible for T, then Φ∨Ψ is Admissible for T.

Proof. Suppose Φ and Ψ are Admissible for T, i.e., (¬Φ) ├T (├T ¬Φ) and (¬Ψ) ├T (├T ¬Ψ). The goal is to prove (¬(Φ∨Ψ)) ├T (├T ¬(Φ∨Ψ)), which is equivalent to (¬Φ∧¬Ψ) ├T (├T (¬Φ∧¬Ψ)), which follows immediately from the hypothesis.

Theorem. If Φ and Ψ are Admissible for T, then ¬Φ⇨Ψ is Admissible for T.

Proof. (¬Φ⇨Ψ) ⇔ (Φ∨Ψ). Therefore the theorem follows from the previous theorem by Double Negation Elimination.

53 Note that there is an asymmetry in the definition of Admissibility with respect to negation. In general, it does not follow that ¬Φ is admissible for T just because Φ is admissible for T. The asymmetry in Admissibility is analogous to the asymmetry in the Criterion of Refutability [Popper 1962]. For example, the sentence "There are no black swans." is readily refuted by the observation of a black swan. However, the negation is not so readily refuted. Also note that admissibility is different from the following:

├T (Φ ⇨ (├T Φ))

which is equivalent to the following:

├T ((¬├T Φ) ⇨ ¬Φ)

The above statement illustrates a problem with the traditional concept of "Negation as Failure" that was first noted in connection with the development of Planner, namely, "The dumber the system, the more it can prove!" See the discussion on the limitations of Logic Programming.

54 Admissibility is a generalization of the property of being GoldbachLike (emphasized by [Franzén 2005]), which is defined to hold of all sentences s of arithmetic (ℕ) such that

∃f∈Expressions: s = (∀n∈ω f(n)) ∧ BoundedQuantification(f)

where BoundedQuantification(f) means that all the quantifiers in f are bounded, i.e., all quantifiers are of one of the following two forms:
1. ∀variable∈expression …
2. ∃variable∈expression …
where variable does not appear in expression.

Theorem. If s is Goldbach-like, then ⌈s⌉ is Admissible for ℕ.
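The following hedged Python sketch illustrates why bounded quantification matters for Goldbach-like sentences: the inner property is decidable by a bounded search, so any counterexample to the universal sentence could be exhibited and mechanically verified, which is the intuition behind the Admissibility of such sentences. The helper names and the bound of 200 are illustrative only.

# Illustration of bounded quantification in a Goldbach-like sentence. The inner
# property is decidable by a bounded search, so a counterexample to the
# universal sentence, were one to exist, could be exhibited and verified.

def is_prime(k):
    return k >= 2 and all(k % d for d in range(2, int(k ** 0.5) + 1))

def goldbach_holds_at(n):
    # Bounded quantification: exists p <= n with p and n - p both prime.
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n))

# Checking finitely many instances (of course this is not a proof of the
# universal sentence, only of the bounded instances examined):
assert all(goldbach_holds_at(n) for n in range(4, 200, 2))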


The motivation for Admissibility builds on the denotational semantics of the Actor model of computation which were first developed in [Clinger 1981]. Subsequently [Hewitt 2006] developed the TimedDiagrams model with the Computational Representation Theorem which states:

The denotation DenoteS of a closed system S represents all the possible behaviors of S as

DenoteS = ⊔i∈ω ProgressionSⁱ(⊥S)

where ProgressionS is an approximation function that takes a set of approximate behaviors to their next stage and ⊥S is the initial behavior of S.

In this context, Φ is Admissible for S means that ¬Φ implies that there is a counterexample to Φ in DenoteS, so that in the denotational theory of S induced by the system S:

(¬Φ) ├S (├S ¬Φ)

Theorem. For every Φ which is Admissible for T, there is a proof Ψ such that:

├T ProofCheckerT(Ψ, ⌊Φ⌋)=1

The argument of the Liar Paradox is not valid for

theories in Direct Logic.

The argument of the Liar Paradox is not valid in Direct Logic because presumably LiarProposition is not Admissible for the empty theory (the strongly paraconsistent theory that has no axioms beyond those of Direct Logic) and consequently the Reification Reflection Principle of Direct Logic does not apply.

Likewise other standard paradoxes do not hold in Direct

Logic.55

55 For example, Russell's Paradox, Curry's Paradox, and the Kleene-Rosser Paradox are not valid for paraconsistent theories in Direct Logic because, in the empty theory (that has no axioms beyond those of Direct Logic):

Russell's Paradox:
Russell ≡ ⌈Fix(Diagonalize)⌉
where Diagonalize ≡ λ(s) ⌊¬├ ⌈s⌉⌋
∴ Russell ⇔ ¬├ Russell
But presumably ¬├ Russell is not Admissible for the empty theory.

Curry's Paradox:
Curry ≡ ⌈Fix(Diagonalize)⌉
where Diagonalize ≡ λ(s) ⌊⌈s⌉⇨Ψ⌋
∴ Curry ⇔ (Curry ⇨ Ψ)
But presumably, in general, Curry⇨Ψ is not Admissible for the empty theory.

Kleene-Rosser Paradox:
KleeneRosser ≡ Diagonalize(Diagonalize)
where Diagonalize ≡ λ(f) ¬f(f)
∴ KleeneRosser ⇔ ¬KleeneRosser
But presumably KleeneRosser is not Admissible for the empty theory.

Paradox of Provability:
Provable ≡ ⌈Fix(Diagonalize)⌉
where Diagonalize ≡ λ(s) ⌊├ ⌈s⌉⌋
∴ Provable ⇔ ├ Provable
But presumably ├ Provable is not Admissible for the empty theory.

Incompleteness Theorem for Theories of Direct Logic

Incompleteness of a theory T is defined to mean that there is some proposition Φ such that it cannot be proved and neither can its negation, i.e., a theory T is incomplete if and only if there is a proposition Φ such that

¬(├T Φ) ∧ ¬(├T ¬Φ)

The general heuristic for constructing such a sentence is to construct a proposition that says the following:

"This proposition is not provable in T."

Such a proposition (called UnprovableT) can be constructed as follows using the fixed point theorem and diagonalization:

UnprovableT ≡ ⌈Fix(Diagonalize)⌉
where Diagonalize ≡ λ(s) ⌊¬├T ⌈s⌉⌋
Diagonalize(s) is a sentence that says that ⌈s⌉ is not provable in T.


The following lemma verifies that UnprovableT has the desired property:

Lemma: ├T (UnprovableT ⇔ ¬├T UnprovableT)

Proof:
First show that ¬├T UnprovableT is Admissible for T.
Proof: We need to show the following:
(¬(¬├T UnprovableT)) ├T (├T ¬(¬├T UnprovableT))
which by double negation elimination is equivalent to showing
(├T UnprovableT) ├T (├T ├T UnprovableT)
which follows immediately from adequacy.

UnprovableT ⇔ ⌈Fix(Diagonalize)⌉
⇔ ⌈Diagonalize(Fix(Diagonalize))⌉              by the logical fixed point theorem
⇔ ⌈(λ(s) ⌊¬├T ⌈s⌉⌋)(Fix(Diagonalize))⌉          by the definition of Diagonalize
⇔ ⌈⌊¬├T ⌈Fix(Diagonalize)⌉⌋⌉
⇔ ¬├T ⌈Fix(Diagonalize)⌉                        by Admissibility of ¬├T UnprovableT
⇔ ¬├T UnprovableT

Theorem: Theories in Direct Logic are self-provably incomplete.

It is sufficient to prove the following:
1. ├T ¬├T UnprovableT
2. ├T ¬├T ¬UnprovableT

Proof of Theorem:

1) To prove: ├T ¬├T UnprovableT
├T (UnprovableT ⇨ ¬├T UnprovableT)                   lemma
UnprovableT ├T ¬├T UnprovableT                        deduction theorem
(├T UnprovableT) ├T (├T ¬├T UnprovableT)              soundness
├T ¬├T UnprovableT                                    using Simple Self-refutation

2) To prove: ├T ¬├T ¬UnprovableT
├T (¬UnprovableT ⇨ ├T UnprovableT)                   contrapositive of lemma
¬UnprovableT ├T (├T UnprovableT)                      deduction theorem
(├T ¬UnprovableT) ├T (├T ├T UnprovableT)              soundness
(├T ¬UnprovableT) ├T (├T UnprovableT)                 faithfulness
├T ¬├T ¬UnprovableT                                   using Both Meta Self-refutation

However, as shown in the next section, a consequence of self-provable incompleteness is inconsistency.

Inconsistency Theorem for Theories of Direct Logic

“Then logic would force you to do it.”

Carroll [1895] (emphasis added)

Theorem: Theories in Direct Logic are self-provably

inconsistent.56

It is sufficient to show that T proves both ParadoxT and its negation, i.e.,
1. ├T ParadoxT
2. ├T ¬ParadoxT

where ParadoxT ≡ ¬├T UnprovableT

Proof of theorem:
1) ├T ParadoxT is immediate from the incompleteness theorem.

56 This theorem is closely related to dialetheism [Priest and Routley 1989] which made the claim that mathematics is inconsistent (e.g. because of the Liar Paradox). Although the semi-classical mathematical fragment of Direct Logic is evidently consistent, every reflective paraconsistent theory of Direct Logic is necessarily inconsistent because it self-proves the Gödelian paradoxical sentence, cf. [Routley 1979], [Priest and Tanaka 2004], etc.


2) To prove ├T ¬ParadoxT:
(¬├T UnprovableT) ├T UnprovableT                      lemma
(├T ¬├T UnprovableT) ├T (├T UnprovableT)              soundness
├T UnprovableT                                        transitivity of inference from the incompleteness theorem
├T ├T UnprovableT                                     adequacy
├T ¬ParadoxT                                          definition of ParadoxT

Consequences of Logically Necessary Inconsistency

All truth passes through three stages:

First, it is ridiculed.

Second, it is violently opposed.

Third, it is accepted as being self-evident.

Arthur Schopenhauer (1788-1860)

But all is not lost because the following can be said

about this logically necessary inconsistency:

Because T is strongly paraconsistent, that T is

inconsistent about ParadoxT

(by itself) should not

affect other reasoning. Also the subject matter of

ParadoxT

is not of general interest in software

engineering and should not affect reasoning about

current large software systems. So do software

engineers need to care that T is inconsistent about

ParadoxT

as opposed to all the other inconsistencies

of T which they care about more?57

The logically necessary inconsistency concerning

ParadoxT

is a nice illustration of how

inconsistencies often arise in large software

systems: “there can be good arguments (proofs) on

both sides for contradictory conclusions”.

A big advantage of strongly paraconsistent logic is that

it makes fewer mistakes than classical logic when dealing

with inconsistent theories. Since software engineers have

to deal with theories chock full of inconsistencies, strong

paraconsistency should be attractive. However, to make it

relevant we need to provide them with tools that are cost

effective.

At first, TRUTH may seem like a desirable property for

propositions in theories for large software systems.

However, because a paraconsistent reflective theory T is

necessarily inconsistent about ParadoxT

, it is impossible

to consistently assign truth values to propositions of T. In

particular it is impossible to consistently assign a truth

57 Of course, there are other inconsistent propositions of the same

ilk, cf., Rosser [1936].

value to the proposition

ParadoxT

. If the proposition is assigned the value TRUE,

then (by the rules for truth values) it must also be assigned

FALSE and vice versa. It is not obvious what (if anything)

is wrong or how to fix it.

Of course this is contrary to the traditional view of

Tarski. E.g.,

I believe everybody agrees that one of the reasons

which may compel us to reject an empirical theory is

the proof of its inconsistency. . . . It seems to me that

the real reason of our attitude is...: We know (if only

intuitively) that an inconsistent theory must contain false sentences. [Tarski 1944]

On the other hand, Frege [1915] suggested that, in a

logically perfect language, the word 'true' would not appear! According to McGee [2006], he argued that "when we say that it is true that seawater is salty, we don't add

anything to what we say when we say simply that seawater

is salty, so the notion of truth, in spite of being the central

notion of [classical] logic, is a singularly ineffectual

notion. It is surprising that we would have occasion to use

such an impotent notion, nevermind that we would regard

it as valuable and important.”

Concurrency is the Norm

Concurrency has now become the norm. However

nondeterminism came first.

Nondeterministic computation

Several models of nondeterministic computation were

developed including the following:

Lambda calculus The lambda calculus can be viewed as

the earliest message passing programming language (see

Hewitt, Bishop, and Steiger 1973; Abelson and Sussman

1985). For example the lambda expression below

implements a tree data structure when supplied with

parameters for a leftSubTree and rightSubTree. When

such a tree is given a parameter message "getLeft", it

returns leftSubTree and likewise when given the message

"getRight" it returns rightSubTree.

λ(leftSubTree, rightSubTree)
  λ(message)
    if (message == "getLeft") then leftSubTree
    else if (message == "getRight") then rightSubTree
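For comparison, the same tree can be written with Python closures (an illustrative transcription, not part of the original formulation):

# A direct transcription of the tree lambda expression above into Python
# closures: the "tree" is a function that responds to the messages
# "getLeft" and "getRight".

def make_tree(left_sub_tree, right_sub_tree):
    def tree(message):
        if message == "getLeft":
            return left_sub_tree
        elif message == "getRight":
            return right_sub_tree
    return tree

t = make_tree("leaf-1", "leaf-2")
assert t("getLeft") == "leaf-1" and t("getRight") == "leaf-2"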

However, the semantics of the lambda calculus were

expressed using variable substitution in which the values of

parameters were substituted into the body of an invoked

lambda expression. The substitution model is unsuitable

for concurrency because it does not allow the capability of

sharing of changing resources. Inspired by the lambda

calculus, the interpreter for the programming language


Lisp made use of a data structure called an environment so

that the values of parameters did not have to be substituted

into the body of an invoked lambda expression. This

allowed for sharing of the effects of updating shared data

structures but did not provide for concurrency.

Petri nets Prior to the development of the Actor model,

Petri nets were widely used to model nondeterministic

computation. However, they were widely acknowledged to

have an important limitation: they modeled control flow

but not data flow. Consequently they were not readily

composable thereby limiting their modularity. Hewitt

pointed out another difficulty with Petri nets: simultaneous

action, i.e., the atomic step of computation in Petri nets is a

transition in which tokens simultaneously disappear from

the input places of a transition and appear in the output

places. The physical basis of using a primitive with this

kind of simultaneity seemed questionable to him. Despite

these apparent difficulties, Petri nets continue to be a

popular approach to modeling nondeterminism, and are

still the subject of active research.

Simula pioneered using message passing for computation,

motivated by discrete event simulation applications. These

applications had become large and unmodular in previous

simulation languages. At each time step, a large central

program would have to go through and update the state of

each simulation object that changed depending on the state

of which ever simulation objects that it interacted with on

that step. Kristen Nygaard and Ole-Johan Dahl developed

the idea (first described in an IFIP workshop in 1967) of

having methods on each object that would update its own

local state based on messages from other objects. In

addition they introduced a class structure for objects with

inheritance. Their innovations considerably improved the

modularity of programs. Simula used nondeterministic

coroutine control structure in its simulations.

Smalltalk-72 Planner, Simula, Smalltalk-72 [Kay 1975;

Ingalls 1983] and computer networks had previously used

message passing. However, they were too complicated to

use as the foundation for a mathematical theory of

concurrency. Also they did not address fundamental issues

of concurrency.

Alan Kay was influenced by message passing in the

pattern-directed invocation of Planner in developing

Smalltalk-71. Hewitt was intrigued by Smalltalk-71 but

was put off by the complexity of communication that

included invocations with many fields including global,

sender, receiver, reply-style, status, reply, operator

selector, etc.

In November 1972 Kay visited MIT and discussed some

of his ideas for Smalltalk-72 building on the Logo work of

Seymour Papert and the "little person" metaphor of

computation used for teaching children to program.

However, the message passing of Smalltalk-72 was quite

complex [Kay 1975]. Code in the language was viewed by

the interpreter as simply a stream of tokens.58 As Dan

Ingalls [1983] later described it:59

The first (token) encountered (in a program) was looked up in the dynamic context, to determine the receiver of

the subsequent message. The name lookup began with

the class dictionary of the current activation. Failing

there, it moved to the sender of that activation and so on

up the sender chain. When a binding was finally found

for the token, its value became the receiver of a new

message, and the interpreter activated the code for that

object's class.60

58 Subsequent versions of the Smalltalk language largely

followed the path of using the virtual methods of Simula in the message passing structure of programs. However Smalltalk-72

made primitives such as integers, floating point numbers, etc. into objects. The authors of Simula had considered making such

primitives into objects but refrained largely for efficiency reasons. Java at first used the expedient of having both primitive

and object versions of integers, floating point numbers, etc. The C# programming language (and later versions of Java, starting

with Java 1.5) adopted the more elegant solution of using boxing and unboxing, a variant of which had been used earlier in some

Lisp implementations. 59 The Smalltalk system went on to become very influential,

innovating in bitmap displays, personal computing, the class

browser interface, and many other ways. Meanwhile the Actor

efforts at MIT remained focused on developing the science and engineering of higher level concurrency

See the 2nd appendix of this paper on how Actors treated meta-

circular evaluation differently than Smalltalk-72 and Briot [1988]

for ideas that were developed later on how to incorporate some kinds of Actor concurrency into later versions of Smalltalk. 60 According to the Smalltalk-72 Instruction Manual [Goldberg

and Kay 1976]:

There is not one global message to which all message "fetches"

(use of the Smalltalk symbols eyeball, ; colon, :, and open

colon, 𝑜𝑜) refer; rather, messages form a hierarchy which we

explain in the following way-- suppose I just received a

message; I read part of it and decide I should send my friend a message; I wait until my friend reads his message (the one I sent

him, not the one I received); when he finishes reading his

message, I return to reading my message. I can choose to let my

friend read the rest of my message, but then I cannot get the

message back to read it myself (note, however, that this can be

done using the Smalltalk object apply which will be discussed

later). I can also choose to include permission in my message to

my friend to ask me to fetch some information from my message

and to give that in information to him (accomplished by

including : or 𝑜𝑜 in the message to the friend). However,

anything my friend fetches, I can no longer have. In other words,

1) An object (let's call it the CALLER) can send a message

to another object (the RECEIVER) by simply mentioning

the RECEIVER's name followed by the message. 2) The action of message sending forms a stack of messages;

the last message sent is put on the top.

3) Each attempt to receive information typically means

looking at the message on the top of the stack.


Thus the message passing model in Smalltalk-72 was

closely tied to a particular machine model and

programming language syntax that did not lend itself to

concurrency. Also, although the system was bootstrapped

on itself, the language constructs were not formally defined

as objects that respond to Eval messages (see discussion

below).

Computation was conceived in terms of

nondeterministic computation (e.g. Turing machines, Post

productions, the lambda calculus, Petri nets,

nondeterministic simulations, etc.) in which each

computational step changed the global state. However, it

was well known that nondeterministic state machines have

bounded nondeterminism, i.e., if a machine is guaranteed

to halt then it halts in a bounded number of states.61


4) The RECEIVER uses the eyeball, , the colon, :, and the

open colon, 𝑜𝑜 , to receive information from the message

at the top of the stack.

5) When the RECEIVER completes his actions, the message

at the top of the stack is removed and the ability to send

and receive messages returns to the CALLER. The

RECEIVER may return a value to be used by the

CALLER.

6) This sequence of sending and receiving messages, viewed here as a process of stacking messages, means that each

message on the stack has a CALLER (message sender)

and RECEIVER (message receiver). Each time the

RECEIVER is finished, his message is removed from the stack and the CALLER becomes the current RECEIVER.

The now current RECEIVER can continue reading any information remaining in his message.

7) Initially, the RECEIVER is the first object in the message

typed by the programmer, who is the CALLER.

8) If the RECEIVER's message contains a eyeball, ; colon,

:, or open colon, 𝑜𝑜 , he can obtain further information

from the CALLER's message. Any information successfully obtained by the RECEIVER is no longer

available to the CALLER.

9) By calling on the object apply, the CALLER. can give the

RECEIVER the right to see all of the CALLER's remaining message. The CALLER can no longer get

information that is read by the RECEIVER; he can, however, read anything that remains after the RECEIVER

completes its actions. 10) There are two further special Smalltalk symbols useful in

sending and receiving messages. One is the keyhole, ,

that lets the RECEIVER “peek” at the message. It is the

same as the 𝑜𝑜 except it does not remove the information

from the message. The second symbol is the hash mark, #, placed in the message in order to send a reference to the

next token rather than the token itself. 61 Bounded nondeterminism may at first seem like a rather

esoteric property that is of no practical interest. However, this

turns out not to be the case. See below.

However, there is no bound that can be placed on how long it takes a computational circuit called an arbiter to settle. Arbiters are used in computers to deal with the

circumstance that computer clocks operate asynchronously

with input from outside, e.g. keyboard input, disk access,

network input, etc. So it could take an unbounded time for

a message sent to a computer to be received and in the

meantime the computer could traverse an unbounded

number of states.62 Thus computers have the property of

unbounded nondeterminism. So there is an inconsistency

between the nondeterministic state model of computation

and the circuit model of arbiters.63

Actors [Hewitt, Bishop, and Steiger 1973] was a new

model of computation based on message passing in which

there is no global state and unbounded nondeterminism is

modeled. Furthermore, unbounded nondeterminism is a

fundamental property of the Actor Model because it

provides a guarantee of service for shared resources. In

previous models of computation with bounded

nondeterminism, it was possible for a request to a shared

resource to never receive service because it was possible

that a nondeterministic choice would always be made to

service another request instead.
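The following Python sketch (illustrative only) shows arrival-order indeterminacy in miniature: two concurrent senders post messages to one mailbox, and the interleaving is determined by arbitration in the runtime and operating system rather than by anything deducible from the program text.

# A small sketch of arrival-order indeterminacy: two concurrent senders post
# messages to one mailbox. The arrival order is determined by arbitration in
# the runtime and the operating system, not by the program text.

import queue, threading

mailbox = queue.Queue()

def sender(tag, count):
    for i in range(count):
        mailbox.put((tag, i))

threads = [threading.Thread(target=sender, args=("A", 5)),
           threading.Thread(target=sender, args=("B", 5))]
for t in threads: t.start()
for t in threads: t.join()

arrival_order = [mailbox.get() for _ in range(10)]
print(arrival_order)   # some interleaving of A's and B's messages; which one,
                       # is only known after the run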

Computation is not subsumed by logical deduction

The notion of computation has been evolving for a long

time. One of the earliest examples was Euclid's GCD

algorithm. Next came mechanical calculators of various

kinds. These notions were formalized in the Turing

Machines, the lambda calculus, etc. paradigm that focused

on the “state” of a computation that could be logically

inferred from the “previous” state.

The invention of digital computers caused a decisive paradigm shift when the notion of an interrupt was invented so that input that arrived asynchronously from outside could be incorporated in an ongoing computation. The break was decisive because asynchronous communication cannot be implemented by Turing machines etc. because the order of arrival of messages cannot be logically inferred. Message passing has become the foundation of many-core and client-cloud computing.

Kowalski developed the thesis that "computation could be subsumed by deduction" [Kowalski 1988], which he states was first proposed by Hayes [1973] in the form "Computation = controlled deduction." [Kowalski 1979]. The Hayes-Kowalski thesis was valuable in that it motivated further research to characterize exactly which computations could be performed by Logic Programming.

Contrary to the quotations (above) by Kowalski and Hayes, computation in general cannot be subsumed by deduction and, contrary to the quotation (above) attributed to Hayes, computation in general is not controlled

62 Thus the computer may not be in any defined stable state for an

unbounded period of time [Hewitt 2006]. 63 Of course the same limitation applies to the Abstract State

Machine (ASM) model [Blass, Gurevich, Rosenzweig, and

Rossman 2007a, 2007b; Glausch and Reisig 2006]. In the

presence of arbiters, the global states in ASM are mythical.


deduction. In fact, Logic Programming is not computationally universal as explained below.

Arrival order indeterminacy

Hewitt and Agha [1991] and other published work argued that mathematical models of concurrency did not determine particular concurrent computations, as follows: The Actor Model64 makes use of arbitration for determining which message is next in the arrival order of an Actor that is sent multiple messages concurrently. For example, Arbiters can be used in the implementation of the arrival order of messages sent to an Actor which are subject to indeterminacy in their arrival order. Since arrival orders are in general indeterminate, they cannot be deduced from prior information by mathematical logic alone. Therefore mathematical logic cannot implement concurrent computation in open systems.

In concrete terms for Actor systems, typically we cannot observe the details by which the arrival order of messages for an Actor is determined. Attempting to do so affects the results and can even push the indeterminacy elsewhere. Instead of observing the internals of arbitration processes of Actor computations, we await outcomes. Indeterminacy in arbiters produces indeterminacy in Actors. The reason that we await outcomes is that we have no alternative because of indeterminacy.

It is important to be clear about the basis for the published claim about the limitation of mathematical logic. It was not that individual Actors could not in general be

64 Actors are the universal primitives of concurrent computation. Process calculi (e.g. [Milner 1993]) are closely related to the Actor model. There are many similarities between the two approaches, but also several differences (some philosophical, some technical):

 There is only one Actor model (although it has numerous formal systems for design, analysis, verification, modeling, etc.); there are numerous process calculi, developed for reasoning about a variety of different kinds of concurrent systems at various levels of detail (including calculi that incorporate time, stochastic transitions, or constructs specific to application areas such as security analysis).

 The Actor model was inspired by the laws of physics and depends on them for its fundamental axioms, i.e. physical laws (see Actor model theory); the process calculi were originally inspired by algebra [Milner 1993].

 Processes in the process calculi are anonymous, and communicate by sending messages either through named channels (synchronous or asynchronous), or via ambients (which can also be used to model channel-like communications [Cardelli and Gordon 1998]). In contrast, actors in the Actor model possess an identity, and communicate by sending messages to the mailing addresses of other actors (this style of communication can also be used to model channel-like communications).

The publications on the Actor model and on process calculi have a fair number of cross-references, acknowledgments, and reciprocal citations.

implemented in mathematical logic. The claim is that because of the indeterminacy of the physical basis of communication in the Actor model, no kind of inferential mathematical logic can deduce the order of arrival of future messages and the resulting computational steps.

Computational Representation Theorem

What does the mathematical theory of Actors have to say about this? A closed system is defined to be one which does not communicate with the outside. Actor model theory provides the means to characterize all the possible computations of a closed system in terms of the Computational Representation Theorem [Clinger 1982; Hewitt 2006]:

The denotation DenoteS of a closed system S represents all the possible behaviors of S as

DenoteS = ⊔i∈ω ProgressionSⁱ(⊥S)

where ProgressionS is an approximation function that takes a set of partial behaviors to their next stage and ⊥S is the initial behavior of S.

In this way, the behavior of S can be mathematically

characterized in terms of all its possible behaviors

(including those involving unbounded nondeterminism).

Although DenoteS is not an implementation of S, it can be

used to prove a generalization of the Church-Turing-

Rosser-Kleene thesis [Kleene 1943]:

Enumeration Theorem: If the primitive Actors of a

closed Actor System S are effective, then the possible

outputs of S are recursively enumerable.

Proof: Follows immediately from the

Representation Theorem.
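A hedged Python sketch of the idea behind the Enumeration Theorem: for a toy system whose steps have finitely many effective alternatives, the possible outputs can be enumerated by a breadth-first walk over the tree of nondeterministic choices. The state representation and step function below are hypothetical.

# Sketch of enumerating the possible outputs of a toy nondeterministic system
# by breadth-first search over its choice tree. `step` maps a state to a finite
# list of successor states; a state with no successors is an output.

from collections import deque

def enumerate_outputs(initial, step, limit=100):
    outputs, frontier = [], deque([initial])
    while frontier and len(outputs) < limit:
        state = frontier.popleft()
        successors = step(state)
        if not successors:              # no successors: state is an output
            outputs.append(state)
        else:
            frontier.extend(successors)
    return outputs

# Toy system: even states branch to two larger states, odd states are outputs.
step = lambda n: [] if n % 2 == 1 else [n + 1, n + 2]
print(enumerate_outputs(0, step, limit=5))   # first few reachable outputs: [1, 3, 5, 7, 9]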

The upshot is that concurrent systems can be

represented and characterized by logical deduction

but cannot be implemented. Thus, the following

practical problem arose:

How can practical programming languages be

rigorously defined since the proposal by Scott and

Strachey [1971] to define them in terms of the lambda

calculus failed because the lambda calculus cannot

implement concurrency?

One solution is to develop a concurrent variant of the

Lisp meta-circular definition [McCarthy, Abrahams,

Edwards, Hart, and Levin 1962] that was inspired by

Turing's Universal Machine [Turing 1936]. If exp is a Lisp

expression and env is an environment that assigns values

to identifiers, then the procedure EVAL with arguments

exp and env evaluates exp using env. In the concurrent

variant, Eval<env> is a message that can be sent to exp to

cause exp to be evaluated. Using such messages, modular


meta-circular definitions can be concisely expressed in the

Actor model for universal concurrent programming

languages [Hewitt 2009c].
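A minimal sketch (in Python rather than ActorScript) of the idea that language constructs are objects responding to an Eval message that carries an environment; the class names are illustrative, not the paper's definitions.

# A sketch of expressions as objects that respond to an "eval" message carrying
# an environment, in the spirit of a meta-circular definition.

class Identifier:
    def __init__(self, name): self.name = name
    def eval(self, env): return env[self.name]

class Literal:
    def __init__(self, value): self.value = value
    def eval(self, env): return self.value

class Plus:
    def __init__(self, left, right): self.left, self.right = left, right
    def eval(self, env): return self.left.eval(env) + self.right.eval(env)

# (x + 1) evaluated in an environment where x is 41:
expr = Plus(Identifier("x"), Literal(1))
assert expr.eval({"x": 41}) == 42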

Concurrency requires unbounded nondeterminism

In theoretical Computer Science, unbounded nondeterminism (sometimes called unbounded indeterminacy) is a property of concurrency by which the amount of delay in servicing a request can become unbounded as a result of arbitration of contention for shared resources while still guaranteeing that the request will eventually be serviced. Unbounded nondeterminism became an important issue in the development of the denotational semantics.

Alleged to be impossible to implement

Edsger Dijkstra [1976] argued that it is impossible to implement systems with unbounded nondeterminism although the Actor model [Hewitt, Bishop, and Steiger 1973] explicitly supported unbounded nondeterminism.

Arguments for incorporating unbounded

nondeterminism

Carl Hewitt [1985, 2006] argued against Dijkstra in support of the Actor model:

 There is no bound that can be placed on how long it takes a computational circuit called an arbiter to settle. Arbiters are used in computers to deal with the circumstance that computer clocks operate asynchronously with input from outside, e.g., keyboard input, disk access, network input, etc. So it could take an unbounded time for a message sent to a computer to be received and in the meantime the computer could traverse an unbounded number of states.

 Electronic mail enables unbounded nondeterminism since mail can be stored on servers indefinitely before being delivered.

 Communication links to servers on the Internet can be out of service indefinitely.

Nondeterministic automata

Nondeterministic Turing machines have only bounded nondeterminism. Sequential programs containing guarded commands as the only sources of nondeterminism have only bounded nondeterminism [Dijkstra 1976] because choice nondeterminism is bounded. Gordon Plotkin [1976] gave a proof as follows:

Now the set of initial segments of execution sequences of a given nondeterministic program P, starting from a given state, will form a tree. The branching points will correspond to the choice points in the program. Since there are always only finitely many alternatives at each choice point, the branching factor of the tree is always finite. That is, the tree is finitary. Now König's lemma says that if every branch of a finitary tree is finite, then

so is the tree itself. In the present case this means that if every execution sequence of P terminates, then there are only finitely many execution sequences. So if an output set of P is infinite, it must contain a nonterminating computation.

Indeterminacy in concurrent computation versus

nondeterministic automata

Will Clinger [1981] provided the following analysis of the above proof by Plotkin:

This proof depends upon the premise that if every node x of a certain infinite branch can be reached by some computation c, then there exists a computation c that goes through every node x on the branch. ... Clearly this premise follows not from logic but rather from the interpretation given to choice points. This premise fails for arrival nondeterminism [in the arrival of messages in the Actor model] because of finite delay [in the arrival of messages]. Though each node on an infinite branch must lie on a branch with a limit, the infinite branch need not itself have a limit. Thus the existence of an infinite branch does not necessarily imply a nonterminating computation.

Bounded nondeterminism in the original version of

Communicating Sequential Processes (CSP)

Consider the following program written in CSP [Hoare 1978]:

[X :: Z!stop( )
 ||
 Y :: guard: boolean; guard := true;
      *[guard → Z!go( ); Z?guard]
 ||
 Z :: n: integer; n := 0;
      continue: boolean; continue := true;
      *[X?stop( ) → continue := false;
       [] Y?go( ) → n := n + 1; Y!continue ]
]

According to Clinger [1981]: this program illustrates global nondeterminism, since the nondeterminism arises from incomplete specification of the timing of signals between the three processes X, Y, and Z. The repetitive guarded command in the definition of Z has two alternatives: either the stop message is accepted from X, in which case continue is set to false, or a go message is accepted from Y, in which case n is incremented and Y is sent the value of continue. If Z ever accepts the stop message from X, then X terminates. Accepting the stop causes continue to be set to false, so after Y sends its next go message, Y will receive false as the value of its guard and will terminate. When both X and Y have terminated, Z terminates because it no longer has live processes providing input.


As the author of CSP points out, therefore, if the repetitive guarded command in the definition of Z were required to be fair, this program would have unbounded nondeterminism: it would be guaranteed to halt but there would be no bound on the final value of n65

. In actual fact, the repetitive guarded commands of CSP are not required to be fair, and so the program may not halt [Hoare 1978]. This fact may be confirmed by a tedious calculation using the semantics of CSP [Francez, Hoare, Lehmann, and de Roever 1979] or simply by noting that the semantics of CSP is based upon a conventional power domain and thus does not give rise to unbounded nondeterminism.66
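The following Python sketch (illustrative, with a probabilistic stand-in for fairness) renders the repetitive guarded command in Z: because the stop alternative is eventually taken, every run halts, yet the final value of n has no fixed bound across runs.

# A sketch of Z's repetitive guarded command under a "fair" scheduler, modeled
# here probabilistically: the stop alternative is eventually chosen, so the
# loop halts, yet no bound can be placed on the final value of n across runs.

import random

def run_Z_with_fair_choice():
    n, continue_ = 0, True
    while continue_:
        # Nondeterministic choice between the two guarded alternatives.
        if random.random() < 0.1:     # accept the stop message from X
            continue_ = False
        else:                         # accept a go message from Y
            n += 1
    return n

print([run_Z_with_fair_choice() for _ in range(5)])   # halts every run, but the
                                                      # counts have no fixed bound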

Since it includes the nondeterministic λ-calculus, reflection, and mathematical induction in addition to its other inference capabilities, Direct Logic is a very powerful Logic Programming language.

Unbounded nondeterminism in an Actor

programming language

Nevertheless, there are concurrent programs that are not equivalent to any Direct Logic program. For example, in the Actor model, the following concurrent program in ActorScriptTM [Hewitt 2009c] returns an integer of unbounded size and is not equivalent to any Direct Logic expression (for the reasoning see below):

Unbounded behavior
  Start< > ↦ Integer67                          when a Start message is received
    let c : Counter = new SimpleCounter(n=0);    let c be a new SimpleCounter with count 0
    {c.Again< >, return c.Stop< >}               send an Again message to c and in parallel
                                                 return the value of sending a Stop message to c

SimpleCounter serializer
  n : Integer                                    n is the current count
  implements Counter68                           implements the Counter interface
  Again< > ↦ Void                                when an Again message is received
    {future self.Again< >, return also become (n=n+1)}
                                                 send an Again message to this counter and in
                                                 parallel return, also incrementing the count
  Stop< > ↦ Integer                              when a Stop message is received
    return n                                     return the count

65 Of course, n would not survive the termination of Z and so the value cannot actually be exhibited after termination! In the ActorScript program above, the unbounded count is sent to the customer of the Start< > message so that it appears externally. 66 Subsequent versions of Communicating Sequential Processes (CSP) ([Hoare 1985; Roscoe 2005]) explicitly provide unbounded nondeterminism. 67 The comment symbol begins a comment that extends to the end of the line.

By the semantics of the Actor model of computation [Clinger 1981; Hewitt 2006], the result of evaluating the expression Unbounded.Start< > is an integer of unbounded size.
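A rough Python rendering of the Unbounded / SimpleCounter behavior (a sketch using threads, not ActorScript semantics): one thread keeps delivering Again messages to itself until the Stop message arrives, and the returned count is always finite but has no predetermined bound.

# A rough Python rendering of Unbounded / SimpleCounter: one thread keeps
# counting ("Again" messages to itself) until a stop flag (the "Stop" message)
# arrives; the value returned is always a finite integer, but no bound can be
# placed on it in advance.

import threading

class SimpleCounter:
    def __init__(self):
        self.n = 0
        self._stop = threading.Event()
    def again_loop(self):            # keep sending itself "Again" until stopped
        while not self._stop.is_set():
            self.n += 1
    def stop(self):                  # "Stop" message: halt the loop, return count
        self._stop.set()
        return self.n

def start():
    c = SimpleCounter()
    t = threading.Thread(target=c.again_loop)
    t.start()                        # "Again" messages proceed concurrently...
    result = c.stop()                # ...while Stop is delivered in parallel
    t.join()
    return result

print(start())                       # a finite integer with no predetermined bound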

Bounded Nondeterminism of Direct Logic

But there is no Direct Logic expression that is equivalent to

Unbounded.Start< > for the following reason:

An expression E will be said to always converge (written as E⇓) if and only if every reduction path terminates, i.e., there is no function f:ℕ→Expressions such that

f(0)=E and ∀n: f(n) ↝ f(n+1)

where the symbol ↝ is used for reduction in the nondeterministic λ-calculus (see the Appendix). For example, it is not the case that ((λ(x) 0 | x(x)) (λ(x) 0 | x(x)))⇓,69 because there is a nonterminating path.

68 The Counter interface can be defined as follows:

Counter subInterface ?          Counter is a subinterface of the universal interface ?
  Again< > ↦ Void               an Again<> message returns Void
  Stop< > ↦ Integer             a Stop<> message returns an Integer

69 Note that there are two bodies (separated by "|") in each of the λ expressions, which provides for nondeterminism.


Theorem: Bounded Nondeterminism of Direct Logic. If an expression E in Direct Logic always converges, then there is a bound Bound on the number of values to which it can converge. I.e.,

∀n: (E ↝* n ⇨ n ≤ Bound)

Consequently there is no Direct Logic program equivalent to Unbounded.Start< > because it has unbounded

nondeterminism whereas every Direct Logic program has

bounded nondeterminism.

In this way we have proved that the Procedural

Embedding of Knowledge paradigm is strictly more

general than the Logic Programming paradigm.
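For contrast, here is a hedged Python sketch of bounded choice nondeterminism: when every execution path of a finitely-branching program terminates, the choice tree is finite by König's lemma, so the set of possible results can be computed by exhausting it. The toy step function is hypothetical.

# Sketch of bounded (choice) nondeterminism: when every execution path of a
# program with finitely many alternatives per choice point terminates, the
# whole choice tree is finite (König's lemma), so the set of possible results
# can be computed by exhausting it.

def possible_results(state, step):
    """step(state) -> list of successor states, or [] when state is terminal."""
    successors = step(state)
    if not successors:
        return {state}
    results = set()
    for s in successors:
        results |= possible_results(s, step)
    return results

# Toy program: flip up to three binary choices, then stop.
step = lambda bits: [] if len(bits) == 3 else [bits + "0", bits + "1"]
print(sorted(possible_results("", step)))   # exactly 8 outcomes, a fixed bound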

Scientific Community Metaphor

Building on the Actor model of concurrent computation,

Kornfeld and Hewitt [1981] developed fundamental

principles for Logic Programming in the Scientific

Community Metaphor [Hewitt 2006 2008b]:

Monotonicity: Once something is published it

cannot be undone. Scientists publish their results so

they are available to all. Published work is collected

and indexed in libraries. Scientists who change their

mind can publish later articles contradicting earlier

ones. However, they are not allowed to go into the

libraries and “erase” old publications.

Concurrency: Scientists can work concurrently,

overlapping in time and interacting with each other.

Commutativity: Publications can be read regardless

of whether they initiate new research or become

relevant to ongoing research. Scientists who become

interested in a scientific question typically make an

effort to find out if the answer has already been

published. In addition they attempt to keep abreast

of further developments as they continue their work.

Sponsorship: Sponsors provide resources for

computation, i.e., processing, storage, and

communications. Publication and subscription

require sponsorship although sometimes costs can

be offset by advertising.

Pluralism: Publications include heterogeneous,

overlapping and possibly conflicting information.

There is no central arbiter of truth in scientific

communities.

Skepticism: Great effort is expended to test and

validate current information and replace it with

better information.

Provenance: The provenance of information is

carefully tracked and recorded.
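A small Python sketch (illustrative only) of the Monotonicity and Provenance principles: publications can only be added, never erased, and contradicting publications coexist, each with its recorded provenance.

# A sketch of Monotonicity and Provenance: publications can only be added,
# never erased; a later publication can contradict an earlier one, but both
# remain on record with their provenance.

class PublicationStore:
    def __init__(self):
        self._publications = []                      # append-only
    def publish(self, author, claim):
        self._publications.append({"author": author, "claim": claim})
    def publications_about(self, keyword):
        return [p for p in self._publications if keyword in p["claim"]]

library = PublicationStore()
library.publish("Alice", "System S always halts")
library.publish("Bob", "System S does not always halt")     # contradicts Alice
print(library.publications_about("System S"))                # both are retained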

Initial experiments implementing the Scientific

Community Metaphor revolved around the development of

a programming language named Ether that had procedural

plans to process goals and assertions concurrently and

dynamically created new plans during program execution

[Kornfeld and Hewitt 1981]. Ether also addressed issues of

conflict and contradiction with multiple sources of

knowledge and multiple viewpoints.

Ether used viewpoints to relativise information in

publications. However a great deal of information is shared

across viewpoints. So Ether made use of inheritance so that

information in a viewpoint could be readily used in other

viewpoints. Sometimes this inheritance is not exact as

when the laws of physics in Newtonian mechanics are

derived from those of Special Relativity. In such cases,

Ether used translation instead of inheritance building on

work by Imre Lakatos [1976] who studied very

sophisticated kinds of translations of mathematical

theorems (e.g., the Euler formula for polyhedra). Later

Bruno Latour [1988] analyzed translation in scientific

communities.

Viewpoints were used to implement natural deduction

(Fitch [1952]) in Ether. In order to prove a goal of the form

├V (P ⇨ Q) for a viewpoint V, it is sufficient to create a

new viewpoint V' that inherits from V, assert ├V’ P, and

then prove ├V’ Q. Hierarchical viewpoints of this kind

were introduced into Planner-like languages in the context

mechanism of QA-4 [Rulifson, Derksen, and Waldinger

1973].
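The viewpoint mechanism for natural deduction can be sketched as follows (a hedged Python illustration, not Ether itself): to establish P ⇨ Q in a viewpoint V, create a child viewpoint that inherits from V, assert P there, and check whether Q becomes derivable.

# A sketch of viewpoints with inheritance used for the natural-deduction rule
# for ⇨: to establish P ⇨ Q in viewpoint V, create a child viewpoint that
# inherits V, assert P in the child, and see whether Q is derivable there.

class Viewpoint:
    def __init__(self, parent=None):
        self.parent, self.assertions, self.rules = parent, set(), []
    def assert_(self, p):
        self.assertions.add(p)
    def add_rule(self, antecedent, consequent):        # record "antecedent ⇨ consequent"
        self.rules.append((antecedent, consequent))
    def all_assertions(self):
        inherited = self.parent.all_assertions() if self.parent else set()
        return self.assertions | inherited
    def all_rules(self):
        inherited = self.parent.all_rules() if self.parent else []
        return self.rules + inherited
    def holds(self, q):                                # naive back-chaining, no cycle check
        if q in self.all_assertions():
            return True
        return any(c == q and self.holds(a) for a, c in self.all_rules())
    def proves_implication(self, p, q):
        child = Viewpoint(parent=self)                 # V' inherits from V
        child.assert_(p)                               # assert P in V'
        return child.holds(q)                          # then try to derive Q in V'

V = Viewpoint()
V.add_rule("rain", "wet")
print(V.proves_implication("rain", "wet"))             # True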

Resolving issues among viewpoints requires negotiation

as studied in the sociology and philosophy of science.

The admission of logical powerlessness

Descartes [1644] put forward the thesis that reflection conveys power, specifically the power of existence, as in “I think, therefore I am.”70 Reflection conveys ability for large software systems to reason about the possible outcomes of their actions. However reflection comes with logical limitations including the following

Admissibility. It may not be safe to use reflection on propositions (about outcomes) that are not admissible.

Incompleteness. It may be impossible to logically prove or disprove outcomes.

Undecidability. Outcomes may be recursively undecidable.

Strong Paraconsistency. There are typically good arguments for both sides of contradictory conclusions.

Necessary Inconsistency. An unstratified reflective strongly paraconsistent theory of Direct Logic is necessarily inconsistent.

Concurrency. Other concurrently operating system components may block, interfere with, or revert possible outcomes.

70 From the Latin, “Cogito ergo sum.”


Indeterminacy. Because of concurrency, the outcomes may be physically indeterminate.

Entanglement. The very process of reflection about possible outcomes can affect the outcomes.

Partiality. There might not be sufficient information or resources available to infer outcomes.

Nonuniversality. Logic Programs are not computationally universal because they cannot implement some concurrent programs.

These limitations lead to an admission of logical powerlessness:

In general, a component of a large software system is logically powerless over the outcome of its actions.

This admission of powerlessness needs to become part of the common sense of large software systems.71

Work to be done

There is much work to be done to further develop Direct

Logic:

The consistency of the semi-classical fragment of

Direct Logic needs to be proved relative to the

consistency of classical mathematics.72 The decidability of the Variable-free Fragment73 of

Direct Logic needs to be settled. As remarked

above, the Boolean Fragment is very close to R-

Mingle (which is decidable).

Strong Paraconsistency of reflective theories of

Direct Logic needs to be formally defined and

proved.

Church remarked as follows concerning a

Foundation of Logic that he was developing: Our present project is to develop the consequences

of the foregoing set of postulates until a contradiction is obtained from them, or until the development has been carried so far consistently as

to make it empirically probable that no contradiction can be obtained from them. And in

this connection it is to be remembered that just such

empirical evidence, although admittedly

inconclusive, is the only existing evidence of the freedom from contradiction of any system of

mathematical logic which has a claim to adequacy. [Church 1933]74

71 Admission of powerlessness is the beginning of Step 1 in 12-

step programs of recovery from addiction, first developed by

Alcoholics Anonymous, e.g., see Wilson [1952]. 72 E.g., using techniques like those in Feferman [2000]. 73 including the non-Boolean ├T.

74 The difference between the time that Church wrote the above

and today is that the standards for adequacy have gone up

dramatically. Direct Logic must be adequate to the needs of

reasoning about large software systems. Reification reflection is

one of the biggest challenges to proving that Direct Logic is

Direct Logic is in a similar position except that the

task is to demonstrate strong paraconsistency

instead of consistency. Also Direct Logic has

overcome many of the problems of Church's

Foundation of Logic.

Inconsistencies such as the one about ├T ParadoxT are relatively benign in the sense that they lack significant consequences for software engineering. Other propositions such as ├T 1=0 are more malignant because they can be used to paraconsistently infer that all integers are equal to 0. To address malignant propositions, deeper investigations of provability using ╟T75 must be undertaken.

Tooling for Direct Logic needs to be developed to

support large software systems.

Conclusion

We are now approaching the half century mark of the Logicist Programme for Artificial Intelligence that was initiated by McCarthy. It has been a fascinating adventure full of twists and turns!

Logicists are now challenged as to whether they agree that
 Strong Paraconsistency is the norm.
 Unstratified inference and reflection are the norm.
 Logic Programming is not computationally universal.

A number of Logicists feel threatened by the results in this paper.

Some would like to stick with just classical logic and not consider strong paraconsistency.76

strongly paraconsistent. Furthermore, reification reflection seems to be an insurmountable barrier to developing a set theoretic

model for Direct Logic. 75 Ψ ╟T Φ means that Ψ is a proof of Φ in T.

76 In 1994, Alan Robinson noted that he has “always been a little

quick to make adverse judgments about what I like to call 'wacko logics' especially in Australia…I conduct my affairs as though I believe … that there is only one logic. All the rest is variation in what you're reasoning about, not in how you're reasoning … [Logic] is immutable." (quoted in Mackenzie [2001] page 286)

On the other hand Richard Routley noted: … classical logic bears a large measure of responsibility for

the growing separation between philosophy and logic which there is today… If classical logic is a modern tool

inadequate for its job, modern philosophers have shown a

classically stoic resignation in the face of this inadequacy.

They have behaved like people who, faced with a device, designed to lift stream water, but which is so badly designed

that it spills most of its freight, do not set themselves to the design of a better model, but rather devote much of their

energy to constructing ingenious arguments to convince


Some would like to stick with the Tarskian stratified theories and not consider unstratified inference and reflection.

Some would like to stick with just Logic Programming (e.g. nondeterministic Turing Machines, λ calculus, etc.) and not consider concurrency.

And some would like to have nothing to do with any of the above! However, the results in this paper (and the driving technological and economic forces behind them) tend to push towards strong paraconsistency, unstratified inference and reflection, and concurrency. The requirements of large software systems are pushing towards strong paraconsistency and unstratified inference and reflection while Web Services and many-core architectures are pushing towards concurrency. [Hewitt 2008a]

Software engineers for large software systems often have good arguments (proofs) for some proposition P and also good arguments (proofs) for the negation of P, which is troubling. So what do large software manufacturers do? If the problem is serious, they bring it before a committee of stakeholders to try and sort it out. In many particularly difficult cases the resulting decision has been to simply live with the problem for a while. Consequently, large software systems are shipped to customers with thousands of known inconsistencies of varying severity. The challenge is to try to keep the situation from getting worse as systems continue to increase in complexity.

A big advantage of strongly paraconsistent logic is that it makes fewer mistakes than classical logic when dealing with inconsistent theories. Since software engineers have to deal with theories chock full of inconsistencies, strong paraconsistency should be attractive. However, to make it relevant we need to provide them with tools that are cost effective.

This paper develops a very powerful formalism (called Direct Logic) that incorporates the mathematics of Computer Science and allows unstratified inference and reflection for almost all of classical logic to be used in strongly paraconsistent theories in a way that is suitable for Software Engineering. Direct Logic allows unstratified direct and indirect mutual reference among use cases, documentation, and code thereby overcoming the limitations of the traditional assumption of hierarchical metatheories .

Gödel first formalized and proved that it is not possible to decide all mathematical questions by inference in his 1st incompleteness theorem. However, the incompleteness theorem (as generalized by Rosser) relies on the assumption of consistency! This paper proves a generalization of the Gödel/Rosser incompleteness


theorem: a theory in Direct Logic is incomplete. However, there is a further consequence. Although the semi-classical mathematical fragment of Direct Logic is evidently consistent, since the Gödelian paradoxical proposition is self-provable, every theory in Direct Logic is inconsistent!77 The mathematical exploration of diagonalization and reflection has been through Eubulides [4th century BC], Cantor [1890], Zermelo [1908], Russell [1908], Gödel [1931], Rosser [1936], Turing [1936], Curry [1942], Löb [1955], etc. leading ultimately to logically necessary inconsistency.

The concept of TRUTH has already been hard hit by the pervasive inconsistencies of large software systems. Accepting the necessary logical inconsistency of reflective strongly paraconsistent theories would be another nail in its coffin. Ludwig Wittgenstein (ca. 1939) said “No one has ever yet got into trouble from a contradiction in logic,” to which Alan Turing responded “The real harm will not come in unless there is an application, in which case a bridge may fall down.” [Holt 2006] It seems that we may now have arrived at the remarkable circumstance that we can’t keep our systems from crashing without allowing contradictions into our logic!

77 Why did Gödel and the logicians who followed him not go in this direction? Feferman [2006b] remarked on “the shadow of Hilbert that loomed over Gödel from the beginning to the end of his career.” Also Feferman [2006a] conjectured that “Gödel simply found it galling all through his life that he never received the recognition from Hilbert that he deserved.” Furthermore, Feferman maintained that “the challenge remained well into his last decade for Gödel to demonstrate decisively, if possible, why it is necessary to go beyond Hilbert’s finitism in order to prosecute the constructive consistency program.” Indeed Gödel saw his task as being “to find a consistency proof for arithmetic based on constructively evident though abstract principles” [Dawson 1997 pg. 263].
Also Gödel was a committed Platonist, which has an interesting bearing on the issue of the status of reflection. Gödel invented arithmetization to encode abstract mathematical propositions as integers. Direct Logic provides a similar way to easily formalize and paraconsistently prove Gödel’s argument. But it is not clear that Direct Logic is fully compatible with Gödel’s Platonism.
With an argument just a step away from inconsistency, Gödel (with his abundance of caution [Feferman 1984b, Dawson 1997]) could not conceive going in that direction. In fact, you could argue that he set up his whole hierarchical framework of metatheories and object theories to avoid inconsistency. A Platonist of his kind could argue that Direct Logic is a mistaken formalism because, in Direct Logic, all strongly paraconsistent reflective theories are inconsistent. In this view, the inconsistency simply proves the necessity of the hierarchy of metatheories and object theories. However, reasoning about large software systems is made more difficult by attempting to develop such a hierarchy for the chock full of inconsistencies theories that use reflection for code, use cases, and documentation. In this context, it is not especially bothersome that theories of Direct Logic are inconsistent about ├T ParadoxT.
On the other hand, Wittgenstein was more prepared to consider the possibility of this inconsistency [Wittgenstein 1978]. According to Priest [2004], in 1930 Wittgenstein remarked: “Indeed, even at this stage, I predict a time when there will be mathematical investigations of calculi containing contradictions, and people will actually be proud of having emancipated themselves from consistency.”


This paper also proves that Logic Programming is not computationally universal in that there are concurrent programs for which there is no equivalent in Direct Logic. Thus the Logic Programming paradigm is strictly less general than the Procedural Embedding of Knowledge paradigm.

Of course the results of this paper do not diminish the importance of logic.78 There is much work to be done!79

Our everyday life is becoming increasingly dependent on large software systems. And these systems are becoming increasingly permeated with inconsistency, reflection and concurrency. As these strongly paraconsistent reflective concurrent systems become a major part of the environment in which we live, it becomes an issue of common sense how to use them effectively. We will need sophisticated software systems to help people understand and apply the principles and practices suggested in this paper. Creating this software is not a trivial undertaking!

Acknowledgements

Sol Feferman, Mike Genesereth, David Israel, Bill Jarrold, Ben Kuipers, Pat Langley, Vladimir Lifschitz, Frank McCabe, John McCarthy, Fanya S. Montalvo, Peter Neumann, Ray Perrault, Natarajan Shankar, Mark Stickel, Richard Waldinger, and others provided valuable feedback at seminars at Stanford, SRI, and UT Austin to an earlier version of the material in this paper. For the AAAI Spring Symposium ’06, Ed Feigenbaum, Mehmet Göker, David Lavery, Doug Lenat, Dan Shapiro, and others provided valuable feedback. At MIT, Henry Lieberman, Ted Selker, Gerry Sussman, and the members of the Common Sense Research Group made valuable comments. Reviewers for AAMAS ’06 and ’07, KR’06, COIN@AAMAS’06, and IJCAR’06 made suggestions for improvement.

78 In a similar way, the incompleteness theorems did not diminish the importance of logic although they also caused concern among some Logicists. For example Paul Bernays (David Hilbert’s assistant) wrote “I was doubtful already sometime before [1931] about the completeness of the formal system [for number theory], and I uttered [my doubts] to Hilbert, who was much angry … Likewise he was angry at Gödel’s results.” (quoted in Dawson [1998])
In fact, Hilbert never became reconciled with incompleteness as evidenced by the last two paragraphs of Hilbert’s preface to [Hilbert and Bernays 1934] (translation by Wilfried Sieg):
“This situation of the results that have been achieved thus far in proof theory at the same time points the direction for the further research with the end goal to establish as consistent all our usual methods of mathematics. With respect to this goal, I would like to emphasize the following: the view, which temporarily arose and which maintained that certain recent results of Gödel show that my proof theory can’t be carried out, has been shown to be erroneous. In fact that result shows only that one must exploit the finitary standpoint in a sharper way for the farther reaching consistency proofs.”

79 The film Dangerous Knowledge [Malone 2007] explores the history of previous crises in the foundations for the logic of knowledge, focusing on the ultimately tragic personal outcomes for Cantor, Boltzmann, Gödel, and Turing.


In the logic community, Mike Dunn, Sol Feferman, Mike Genesereth, Tim Hinrichs, Mike Kassoff, John McCarthy, Chris Mortensen, Graham Priest, Dana Scott, Richard Weyhrauch, and Ed Zalta provided valuable feedback.
Dana Scott made helpful suggestions on reflection and incompleteness. Richard Waldinger provided extensive suggestions that resulted in better focusing a previous version of this paper and increasing its readability. Sol Feferman reminded me of the connection between Admissibility and 1. Discussion with Pat Hayes and Bob Kowalski provided insight into the early history of Prolog. Communications from John McCarthy and Marvin Minsky suggested making common sense a focus. Mike Dunn collaborated on looking at the relationship of the Boolean Fragment of Direct Logic to R-Mingle. Greg Restall pointed out that Direct Logic does not satisfy some Relevantist principles. Gerry Allwein and Jeremy Forth made detailed comments and suggestions for improvement. Bob Kowalski and Erik Sandewall provided helpful pointers and discussion of the relationship with their work. Discussions with Ian Mason and Tim Hinrichs helped me develop Löb’s theorem for Direct Logic. Scott Fahlman suggested introducing the roadmap in the introduction of the paper. At CMU, Wilfried Sieg introduced me to his very interesting work with Clinton Field on automating the search for proofs of the Gödel incompleteness theorems. Also at CMU, I had productive discussions with Jeremy Avigad, Randy Bryant, John Reynolds, Katia Sycara, and Jeannette Wing. At my MIT seminar and afterwards, Marvin Minsky, Ted Selker, Gerry Sussman, and Pete Szolovits made helpful comments. Les Gasser, Mike Huhns, Victor Lesser, Pablo Noriega, Sascha Ossowski, Jaime Sichman, Munindar Singh, etc. provided valuable suggestions at AAMAS’07. I had a very pleasant dinner with Harvey Friedman at Chez Panisse after his 2nd Tarski lecture.
Jeremy Forth, Tim Hinrichs, Fanya S. Montalvo, and Richard Waldinger provided helpful comments and suggestions on the logically necessary inconsistencies in theories of Direct Logic. Rineke Verbrugge provided valuable comments and suggestions at MALLOW’07. Mike Genesereth and Gordon Plotkin kindly hosted my lectures at Stanford and Edinburgh, respectively, on “The Logical Necessity of Inconsistency”.


Inclusion of Cantor’s diagonal argument as motivation as well as significant improvements in the presentation of the incompleteness and inconsistency theorems were suggested by Jeremy Forth. John McCarthy pointed to the distinction between Logic Programming and the Logicist Programme for Artificial Intelligence. Reviewers at JAIR made useful suggestions. Mark S. Miller made important suggestions for improving the meta-circular definition of ActorScript. Comments by Michael Beeson helped make the presentation of Direct Logic more rigorous. Conversations with Jim Larson helped clarify the relationship between classical logic and the logic of paraconsistent theories. An anonymous referee of the Journal of Logic and Computation made a useful comment. John-Jules Meyer and Albert Visser provided helpful advice and suggestions.

References

Hal Abelson and Gerry Sussman Structure and Interpretation of Computer Programs 1984.

Luca Aceto and Andrew D. Gordon (editors). Algebraic Process Calculi: The First Twenty Five Years and Beyond Bertinoro, Italy, August, 2005.

Sanjaya Addanki, Roberto Cremonini, and J. Scott Penberthy. “Reasoning about assumptions in graphs of models” Readings in Qualitative Reasoning about Physical Systems. Kaufman. 1989.

Gul Agha. Actors: A Model of Concurrent Computation in Distributed Systems Doctoral Dissertation. 1986.

Gul Agha, Ian Mason, Scott Smith, and Carolyn Talcott. “A foundation for Actor computation.” Journal of Functional Programming. 1997.

Bruce Anderson. “Documentation for LIB PICO-PLANNER” School of Artificial Intelligence, Edinburgh University. 1972.

Alan Anderson and Nuel Belnap, Jr. (1975) Entailment: The Logic of Relevance and Necessity Princeton University Press.

Robert Anderson and Woody Bledsoe (1970) “A Linear Format for Resolution with Merging and a New Technique for Establishing Completeness” JACM 17.

Aldo Antonelli (2006). “Non-monotonic Logic” Stanford Encyclopedia of Philosophy. March 2006.

A. I. Arruda. “Aspects of the historical development of paraconsistent logic” In Paraconsistent Logic: Essays on the Inconsistent Philosophia Verlag. 1989.

William Athas and Nanette Boden. “Cantor: An Actor Programming System for Scientific Computing” Proceedings of the NSF Workshop on Object-Based Concurrent Programming. 1988. Special Issue of SIGPLAN Notices.
Henry Baker. Actor Systems for Real-Time Computation MIT EECS Doctoral Dissertation. January 1978.
Henry Baker and Carl Hewitt. Laws for Communicating Parallel Processes IFIP. August 1977.
Henry Baker and Carl Hewitt. “The Incremental Garbage Collection of Processes.” Symposium on Artificial Intelligence Programming Languages. SIGPLAN Notices. August 1977.

Bob Balzer. “Tolerating Inconsistency” 13th International Conference on Software Engineering. 1991.

Bruce Baumgart. “Micro-Planner Alternate Reference Manual” Stanford AI Lab Operating Note No. 67, April 1972.

JC Beall and Greg Restall. Logical Pluralism Oxford University Press. 2006.

Michael Beeson. “Lambda Logic” Lecture Notes in Artificial Intelligence 3097. Springer. 2004.

Leopoldo Bertossi, et al., eds. Inconsistency Tolerance Springer. 2004.

Philippe Besnard and Anthony Hunter. “Quasi-classical Logic: Non-trivializable classical reasoning from inconsistent information” Symbolic and Quantitative Approaches to Reasoning and Uncertainty 1995.

Philippe Besnard and Torsten Schaub. “Significant Inferences: Preliminary Report. http://www.cs.uni-potsdam.de/wv/pdfformat/bessch00a.pdf

Jean-Yves Béziau, Walter Carnielli, and Dov Gabbay, eds. Handbook of Paraconsistency. College Publications, King’s College London. 2007.

Fisher Black. A deductive question answering system, Harvard University Thesis. 1964.

Simon Blackburn and Keith Simmons (1999) Truth Oxford University Press.

H. Blair and V. S. Subrahmanian. “Paraconsistent Logic Programming”. Theoretical Computer Science, 68(2) 1989.

Patricia Blanchette “The Frege-Hilbert Controversy” The Stanford Encyclopedia of Philosophy December 7, 2007.

Andreas Blass, Yuri Gurevich, Dean Rosenzweig, and Benjamin Rossman (2007a) Interactive small-step algorithms I: Axiomatization Logical Methods in Computer Science. 2007.

Andreas Blass, Yuri Gurevich, Dean Rosenzweig, and Benjamin Rossman (2007b) Interactive small-step algorithms II: Abstract state machines and the characterization theorem. Logical Methods in Computer Science. 2007.

George Boole. An Investigation of the Laws of Thought 1853. http://www.gutenberg.org/etext/15114

Geof Bowker, Susan L. Star, W. Turner, and Les Gasser (Eds.) Social Science Research, Technical Systems and Cooperative Work Lawrence Erlbaum. 1997.

Robert Boyer (1971) Locking: A Restriction of Resolution Ph. D. University of Texas at Austin.


Daniel Bobrow and Bertram Raphael. “New programming languages for Artificial Intelligence research” ACM Computing Surveys. 1974.

Jean-Pierre Briot. From objects to actors: Study of a limited symbiosis in Smalltalk-80 Rapport de Recherche 88-58, RXF-LITP, Paris, France, September 1988.

Maurice Bruynooghe, Luís Moniz Pereira, Jörg Siekmann, and Maarten van Emden. “A Portrait of a Scientist as a Computational Logician” Computational Logic: Logic Programming and Beyond: Essays in Honour of Robert A. Kowalski, Part I Springer. 2004.

Andrea Cantini “Paradoxes and Contemporary Logic” The Stanford Encyclopedia of Philosophy October 16, 2007.

George Cantor. “Diagonal Argument” German Mathematical Union (Deutsche Mathematiker-Vereinigung) Bd. I, S. 75-78. 1890-1.

Rudolph Carnap. Logische Syntax der Sprache. (The Logical Syntax of Language Open Court Publishing 2003) 1934.

Lewis Carroll “What the Tortoise Said to Achilles” Mind 4. No. 14. 1895.

Lewis Carroll Through the Looking-Glass Macmillan. 1871

Carlo Cellucci “Gödel's Incompleteness Theorem and the Philosophy of Open Systems” Kurt Gödel: Actes du Colloque, Neuchâtel 13-14 juin 1991, Travaux de logique N. 7, Centre de Recherches Sémiologiques, University de Neuchâtel. http://w3.uniroma1.it/cellucci/documents/Goedel.pdf

Carlo Cellucci “The Growth of Mathematical Knowledge: An Open World View” The growth of mathematical knowledge Kluwer. 2000.

Alonzo Church “A Set of postulates for the foundation of logic (1)” Annals of Mathematics. Vol. 33, 1932.

Alonzo Church “A Set of postulates for the foundation of logic (2)” Annals of Mathematics. Vol. 34, 1933.

Alonzo Church The Calculi of Lambda-Conversion Princeton University Press. 1941.

Will Clinger. Foundations of Actor Semantics MIT Mathematics Doctoral Dissertation. June 1981.

Alain Colmerauer and Philippe Roussel. “The birth of Prolog” History of Programming Languages ACM Press. 1996

F. S. Correa da Silva, J. M. Abe, and M. Rillo. “Modeling Paraconsistent Knowledge in Distributed Systems”. Technical Report RT-MAC-9414, Instituto de Matematica e Estatistica, Universidade de Sao Paulo, 1994.

James Crawford and Ben Kuipers. “Negation and proof by contradiction in access-limited logic.” AAAI-91.

Haskell Curry “Some Aspects of the Problem of Mathematical Rigor” Bulletin of the American Mathematical Society Vol. 4. 1941.

Haskell Curry. “The combinatory foundations of mathematics” Journal of Symbolic Logic. 1942.

Michael Cusumano and Richard Selby, R. Microsoft Secrets: How the World‟s Most Powerful Software Company Creates Technology, Shapes Markets, and Manages People. Free Press. 1995

Newton da Costa Inconsistent Formal Systems (Sistemas Formais Inconsistentes in Portuguese) Doctoral dissertation. University of Paraná. 1963.

Ole-Johan Dahl and Kristen Nygaard. “Class and subclass declarations” IFIP TC2 Conference on Simulation Programming Languages. May 1967.

Julian Davies. “Popler 1.5 Reference Manual” University of Edinburgh, TPU Report No. 1, May 1973.

Ernest Davis. “The Naïve Physics Perplex” AI Magazine, Winter 1998.

Ernest Davis and Leora Morgenstern. “A First-Order Theory of Communication and Multi-Agent Plans” Journal of Logic and Computation, Vol. 15, No. 5, 2005.

John Dawson (1997) Logical Dilemmas. The Life and Work of Kurt Gödel AK Peters.

John Dawson. “What Hath Gödel Wrought?” Synthese. Jan. 1998.

Walter Dean and Hidenori Kurokawa. “Knowledge, proof, and the Knower” TARK’09.

Richard Dedekind (1888) “What are and what should the numbers be?” (Translation in From Kant to Hilbert: A Source Book in the Foundations of Mathematics. Oxford University Press. 1996) Braunschweig.

René Descartes (1644) Principles of Philosophy (English translation in The Philosophical Writings of Descartes Cambridge University Press 1985).

Edsger Dijkstra. A Discipline of Programming. Prentice Hall. 1976.

Mike Dunn and Greg Restall. “Relevance Logic” in The Handbook of Philosophical Logic, second edition. Dov Gabbay and Franz Guenther (editors), Kluwer. 2002.

Ralph Waldo Emerson. “Self Reliance “ Essays—First Series. 1841.

Euclid. The Thirteen Books of Euclid's Elements. (3 Vol. translated by Thomas Heath. Cambridge University Press. 1925). Circa 300BC.

Scott Fahlman. A Planning System for Robot Construction Tasks MIT AI TR-283. June 1973.

Adam Farquhar, Angela Dappert, Richard Fikes, and Wanda Pratt. “Integrating Information Sources Using Context Logic” Knowledge Systems Laboratory. KSL-95-12. January 1995.

Solomon Feferman (1984a) “Toward Useful Type-Free Theories, I” in Recent Essays on Truth and the Liar Paradox. Ed. Robert Martin (1991) Clarendon Press.

Solomon Feferman (1984b) “Kurt Gödel: Conviction and Caution” Philosophia Naturalis Vol. 21.

Solomon Feferman (1991) "Reflecting on incompleteness", Journal of Symbolic Logic.

Solomon Feferman (1998) In the Light of Logic Oxford University Press.

Solomon Feferman (2000) “Does reductive proof theory have a viable rationale?” Erkenntnis 53.

Solomon Feferman (2004) “Tarski‟s Conceptual Analysis for Semantical Notions” Sémantique et Épistémologie http://math.stanford.edu/~feferman/papers/conceptanalysco.pdf

Solomon Feferman (2006a) “The nature and significance of Gödel's incompleteness theorems” lecture for the Princeton Institute for Advanced Study Gödel Centenary Program, Nov. 17, 2006. http://math.stanford.edu/~feferman/papers/Godel-IAS.pdf

Solomon Feferman (2006b) “Lieber Herr Bernays! Lieber Herr Gödel! Gödel on finitism, constructivity and Hilbert's program” submitted version of lecture for the Gödel centenary conference, Horizons of Truth, Vienna,


27-29 April 2006. http://math.stanford.edu/~feferman/papers/bernays.pdf

Solomon Feferman (2007a) “Axioms for determinateness and truth” http://math.stanford.edu/~feferman/papers.html

Solomon Feferman (2007b) “Gödel, Nagel, minds and machines” October 25, 2007. http://math.stanford.edu/~feferman/papers/godelnagel.pdf

Anita Burdman Feferman and Solomon Feferman (2004) Alfred Tarski: Life and Logic. Cambridge University Press. 2004.

Dieter Fensel and Frank van Harmelen. “Unifying Reasoning and Search to Web Scale” IEEE Internet Computing. March/April 2007.

Paul Feyerabend. Killing Time: The Autobiography of Paul Feyerabend. University Of Chicago Press. 1995.

Hartry Field. “A Revenge-Immune Solution to the Semantic Paradoxes.” Journal of Philosophical Logic, April 2003

Kit Fine. “Analytic Implication” Notre Dame Journal of Formal Logic. April 1986.

Frederic Fitch. Symbolic Logic: an Introduction. Ronald Press. 1952.

J.M. Foster and E.W. Elcock. (1969) “ABSYS: An Incremental Compiler for Assertions” Machine Intelligence 4. Edinburgh University Press.

Nissim Francez, Tony Hoare, Daniel Lehmann, and Willem-Paul de Roever. “Semantics of nondeterminism, concurrency, and communication” Journal of Computer and System Sciences. December 1979.

Torkel Franzén. Inexhaustibility A K Peters. 2004.
Torkel Franzén. Gödel’s Theorem: an incomplete guide to its use and abuse. A K Peters. 2005.
Gottlob Frege (1915) “My Basic Logical Insights” Posthumous Writings University of Chicago Press. 1979.
Kazuhiro Fuchi, Robert Kowalski, Kazunori Ueda, Ken Kahn, Takashi Chikayama, and Evan Tick. “Launching the new era”. CACM. 1993.

Dov Gabbay (ed.) What is a Logical System? Oxford. 1994.

John Gay. “The Elephant and the Bookseller” Fifty-one Fables in Verse 1727

Michael Gelfond and Vladimir Lifschitz. “Logic programs with classical negation” International Conference on Logic Programming. MIT Press. 1990.

Gerhard Gentzen. “Provability and nonprovability of restricted transfinite induction in elementary number theory” (Collected Papers of Gerhard Gentzen. North-Holland. 1969) Habilitation thesis. Göttingen. 1942.

Gerhard Gentzen (1935) “Investigations into Logical Deduction.” (Collected Papers of Gerhard Gentzen. North-Holland. 1969)

Andreas Glausch and Wolfgang Reisig. Distributed Abstract State Machines and Their Expressive Power Informatik-Berichete 196. Humboldt University of Berlin. January 2006.

Kurt Gödel (1930) “The completeness of the axioms of the functional calculus of logic” (translated in A Source

Book in Mathematical Logic, 1879-1931. Harvard Univ. Press. 1967)

Kurt Gödel (1931) “On formally undecidable propositions of Principia Mathematica” in A Source Book in Mathematical Logic, 1879-1931. Translated by Jean van Heijenoort. Harvard Univ. Press. 1967.

Kurt Gödel (1965) “On Undecidable Propositions of Formal Mathematical Systems” (a copy of Gödel‟s 1931 paper with his corrections of errata and added notes) in The Undecidable: Basic Papers on Undecidable Propositions, Unsolvable problems and Computable Functions Martin Davis editor. Raven Press 1965.

Kurt Gödel (1944) “Russell‟s Mathematical Logic” in Philosophy of Mathematics(2nd ed.) Cambridge University Press.

Solomon Golomb and Leonard Baumert. (1965) “Backtrack Programming” JACM. Vol. 12 No. 4.

C. Cordell Green: “Application of Theorem Proving to Problem Solving” IJCAI 1969.

Steve Gregory. “Concurrent Logic Programming Before ICOT: A Personal Perspective” August 15, 2007.

http://www.cs.bris.ac.uk/~steve/papers/ALP/CLPbeforeICOT.pdf
Irene Greif. Semantics of Communicating Parallel Processes MIT EECS Doctoral Dissertation. August 1975.

Ramanathan Guha. Contexts: Formalization and Some Applications PhD thesis, Stanford University, 1991.

W. D. Hart. “Skolem Redux” Notre Dame Journal of. Formal Logic. 41, no. 4. 2000.

Pat Hayes. “Computation and Deduction” Mathematical Foundations of Computer Science: Proceedings of Symposium and Summer School, Štrbské Pleso, High Tatras, Czechoslovakia. September 1973.

Pat Hayes “Some Problems and Non-Problems in Representation Theory” AISB. Sussex. July, 1974.

Pat Hayes. “The Naïve Physics Manifesto”. Expert Systems in the Microelectronic Age. Edinburgh University Press. 1979.

Pat Hayes. 1985a. “The Second Naïve Physics Manifesto” Formal Theories of the Commonsense World. Ablex. 1985.

Pat Hayes. 1985b. “Naïve Physics 1: Ontology for Liquids” Formal Theories of the Commonsense World. Ablex. 1985.

Pat Hayes. “Contexts in context.” Contexts in Knowledge Representation and Natural Language. AAAI. 1997.

Pat Hayes. “Context Mereology.” Commonsense 2007.
Joseph Heller. Catch-22. Everyman’s Library. 1995.
Leon Henkin. “A Problem Concerning Provability” Journal of Symbolic Logic, Vol. 17 (1952).
Carl Hewitt. “Planner: A Language for Proving Theorems in Robots” IJCAI 1969.
Carl Hewitt. “Procedural Embedding of Knowledge In Planner” IJCAI 1971.
Carl Hewitt, Peter Bishop and Richard Steiger. “A Universal Modular Actor Formalism for Artificial Intelligence” IJCAI 1973.


Carl Hewitt and Henry Baker Laws for Communicating Parallel Processes IFIP. August 1977.

Carl Hewitt. “Viewing Control Structures as Patterns of Passing Messages” Journal of Artificial Intelligence. June 1977.

Carl Hewitt and Peter de Jong. “Open Systems”' Perspectives on Conceptual Modeling, Brodie, Mylopoulos, and Schmidt (eds.), Springer-Verlag, 1983.

Carl Hewitt. “The Challenge of Open Systems” Byte Magazine. April 1985.

Carl Hewitt (1986). “Offices Are Open Systems” ACM Transactions on Information Systems 4(3)

Carl Hewitt (1990). “Towards Open Information Systems Semantics” International Workshop on Distributed Artificial Intelligence

Carl Hewitt (1991). “Open Information Systems Semantics” Journal of Artificial Intelligence. January 1991.

Carl Hewitt and Jeff Inman. “DAI Betwixt and Between: From „Intelligent Agents‟ to Open Systems Science” IEEE Transactions on Systems, Man, and Cybernetics. Nov. /Dec. 1991.

Carl Hewitt and Gul Agha. “Guarded Horn clause languages: are they deductive and Logical?” International Conference on Fifth Generation Computer Systems. Ohmsha 1988.

Carl Hewitt. (2006). “What is Commitment? Physical, Organizational, and Social” COIN@AAMAS‟06. (Revised version to be published in Springer Verlag Lecture Notes in Artificial Intelligence. Edited by Javier Vázquez-Salceda and Pablo Noriega. 2007) April 2006.

Carl Hewitt (2007a). “Organizational Computing Requires Unstratified Paraconsistency and Reflection” COIN@AAMAS. 2007.

Carl Hewitt (2008a) “A historical perspective on developing foundations for privacy-friendly client cloud computing: The paradigm shift from „inconsistency denial‟ to „semantic integration” (Revised version of “Development of Logic Programming: What went wrong, What was done about it, and What it might mean for the future” in Proceedings of What Went Wrong and Why edited by Mehmet Gőker and Daniel Shapiro, AAAI Press. 2008 pp 1-11) ArXiv.

Carl Hewitt (2008b). “Norms and Commitment for ORGs (Organizations of Restricted Generality): Strong Paraconsistency and Participatory Behavioral Model Checking” April 28, 2008. http://normsandcommitmentfororgs.carlhewitt.info/

Carl Hewitt (2008c) “Large-scale Organizational Computing requires Unstratified Reflection and Strong Paraconsistency” Coordination, Organizations, Institutions, and Norms in Agent Systems III Jaime Sichman, Pablo Noriega, Julian Padget and Sascha Ossowski (ed.). Springer-Verlag. http://organizational.carlhewitt.info/

Carl Hewitt (2008d) “Middle History of Logic Programming” Google Knol.

Carl Hewitt (2008d). ORGs (Organizations of Restricted GeneralityTM): Strong Paraconsistency and Participatory Behavioral Model Checking Google Knol.

Carl Hewitt (2008e). ORGs for Scalable, Robust, Privacy-Friendly Client Cloud Computing IEEE Internet Computing September/October 2008.

Carl Hewitt (2009a) Perfect Disruption: The Paradigm Shift from Mental Agents to ORGs IEEE Internet Computing. Jan/Feb 2009.

Carl Hewitt (2009b) A historical perspective on developing foundations for client cloud computing: The Paradigm Shift from "Inconsistency Denial" to "Rapid Recovery" (Revised version of "Development of Logic Programming: What went wrong, What was done about it, and What it might mean for the future" AAAI Workshop on What Went Wrong. AAAI-08.) ArXiv. January 30, 2009.

Carl Hewitt (2009c) “ActorScript™: Industrial strength integration of local and nonlocal concurrency for Client-cloud Computing” ArXiv. 0907.3330

David Hilbert (1926) “Über das Unendliche” Mathematische Annalen, 95: 161-90. (“On the Infinite” English translation in van Heijenoort. 1967).

David Hilbert and Paul Bernays. Grundlagen der Mathematik I. (L'Harmattan edition 2001) 1934

Tony Hoare. “Communicating Sequential Processes” CACM August, 1978.

Tony Hoare. Communicating Sequential Processes. Prentice Hall. 1985.

Tony Hoare. “The verifying compiler: A grand challenge for computing research” JACM. January 2003.

Wilfrid Hodges (2006) “Tarski‟s Truth Definitions” Stanford Encyclopedia of Philosophy.

Douglas Hofstadter. I am a Strange Loop Basic Books. 2007.

Jim Holt. “Code-Breaker” The New Yorker February 6, 2006.

Leon Horsten “Philosophy of Mathematics” The Stanford Encyclopedia of Philosophy September 27, 2007.

Matthew Huntbach and Graem Ringwood. Agent-Oriented Programming: From Prolog to Guarded Definite Clauses Springer. 1999.

Daniel Ingalls. “The Evolution of the Smalltalk Virtual Machine” Smalltalk-80: Bits of History, Words of Advice. Addison Wesley. 1983.

Alan Kay. “Personal Computing” in Meeting on 20 Years of Computing Science Instituto di Elaborazione della Informazione, Pisa, Italy. 1975. http://www.mprove.de/diplom/gui/Kay75.pdf

Jussi Ketonen and Richard Weyhrauch. “A decidable fragment of Predicate Calculus” Theoretical Computer Science. 1984.

Thomas Kida. Don‟t Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking Prometheus Books. 2006.

Stephen Kleene and John Barkley Rosser “The inconsistency of certain formal logics” Annals of Mathematics Vol. 36. 1935.


Stephen Kleene Recursive Predicates and Quantifiers American Mathematical Society Transactions. 1943

Frederick Knabe. “A Distributed Protocol for Channel-Based Communication with Choice” PARLE 1992.

Bill Kornfeld and Carl Hewitt. “The Scientific Community Metaphor” IEEE Transactions on Systems, Man, and Cybernetics. January 1981.

Bill Kornfeld. Parallelism in Problem Solving MIT EECS Doctoral Dissertation. August 1981.

Robert Kowalski “Predicate Logic as Programming Language” Memo 70, Department of Artificial Intelligence, Edinburgh University. 1973

Robert Kowalski (1979) “Algorithm = Logic + Control” CACM. July 1979.

Robert Kowalski (1986). “The limitation of logic” ACM Annual Conference on Computer Science.


Robert Kowalski 1988a. “The Early Years of Logic Programming” CACM. January 1988.

Robert Kowalski (1988b). “Logic-based Open Systems” Representation and Reasoning. Stuttgart Conference Workshop on Discourse Representation, Dialogue tableaux and Logic Programming. 1988. http://www.doc.ic.ac.uk/~rak/papers/open.pdf

Robert. Kowalski and Francesca Toni. “Abstract Argumentation” Artificial Intelligence and Law. 1996

Robert Kowalski (2006) “The Logical Way to be Artificially Intelligent.” CLIMA VI. Springer Verlag.

Robert Kowalski (2007) “What is Logic Programming?” http://en.wikipedia.org/wiki/Talk:Logic_programming#What_is_Logic_Programming.3F

Richard Kraut. “Plato” Stanford Encyclopedia of Philosophy. 2004.

Ernest Kurtz and Katherine Ketcham. The Spirituality of Imperfection: Storytelling and the Search for Meaning Bantam 1993.

Imre Lakatos. “A renaissance of empiricism in the recent philosophy of mathematics?” Mathematics, Science and Epistemology. 1978.

Imre Lakatos. Proofs and Refutations Cambridge University Press. 1976

Imre Lakatos. Mathematics, Science and Epistemology edited by J. Worrall and G. Currie. Cambridge University Press. 1978.

Peter Landin. “A Generalization of Jumps and Labels” UNIVAC Systems Programming Research Report. August 1965. (Reprinted in Higher Order and Symbolic Computation. 1998)

Bruno Latour Science in Action: How to Follow Scientists and Engineers Through Society Harvard University Press. 1988.

Hannes Leitgeb. “What theories of truth should be like (but cannot be)” Philosophy Compass 2 (2). 2007.

Doug Lenat “CYC: Lessons Learned in Large-Scale Ontological Engineering” November 17, 2005.

http://ontolog.cim3.net/file/resource/presentation/DougLenat_20051117/Cyc-DougLenat_20051117.ppt

Henry Lieberman. “A Preview of Act 1” MIT AI memo 625. June 1981.

James Lighthill. "Artificial Intelligence: A General Survey" Artificial Intelligence: a paper symposium. UK Science Research Council. 1973

Martin Löb. “Solution of a problem of Leon Henkin.” Journal of Symbolic Logic. Vol. 20. 1955.

Per Martin-Löf “Verificationism then and now” The Foundational Debate. Kluwer. 1995.

Donald Loveland. Report of a Workshop on the Future Directions of Automated Deduction NSF 1997. http://www.cs.duke.edu/AutoDedFD/report/

Leopold Löwenheim (1915) “Über Möglichkeiten im Relativkalkül” Mathematische Annalen 76. (Translated as “On possibilities in the calculus of relatives” in Jean van Heijenoort, 1967. From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931. Harvard Univ. Press)
Michael Lynch. The Nature of Truth MIT Press. 2001.
Donald MacKenzie. Mechanizing Proof. MIT Press. 2001.
Edwin Mares. “Relevance Logic” Stanford Encyclopedia of Philosophy. Jan. 2006.
David Malone (2007) Dangerous Knowledge BBC Video. http://video.google.com/videoplay?docid=-3503877302082311448

Edwin Mares. Relevant Logic Cambridge University Press. 2007

John McCarthy. “Programs with common sense” Symposium on Mechanization of Thought Processes. National Physical Laboratory. Teddington, England. 1958.

John McCarthy. “Situations, actions and causal laws” Stanford Artificial Intelligence Project: Memo 2. 1963

John McCarthy and Pat Hayes. “Some Philosophical Problems from the Standpoint of Artificial Intelligence” Machine Intelligence 4. 1969

John McCarthy, Paul Abrahams, Daniel Edwards, Timothy Hart, and Michael Levin. Lisp 1.5 Programmer‟s Manual MIT Computation Center and Research Laboratory of Electronics. 1962.

John McCarthy. “Review of „Artificial Intelligence: A General Survey” Artificial Intelligence: a paper symposium. UK Science Research Council. 1973.

John McCarthy. “Circumscription—a form of nonmonotonic reasoning.” Artificial Intelligence. 1980.

John McCarthy. “Applications of circumscription to formalizing common sense knowledge” Artificial Intelligence. 1986.

John McCarthy. “Generality in Artificial Intelligence” CACM. December 1987.

John McCarthy. “A logical AI Approach to Context” Technical note, Stanford Computer Science Department, 1996.

John McCarthy. Sterile Containers September 8, 2000. http://www.ai.sri.com/~rkf/designdoc/sterile.ps


John McCarthy. “What is Artificial Intelligence” September 1, 2007. http://www-formal.stanford.edu/jmc/whatisai/whatisai.html

L. Thorne McCarty. “Reflections on TAXMAN: An Experiment on Artificial Intelligence and Legal Reasoning” Harvard Law Review Vol. 90, No. 5, March 1977.

Drew McDermott and Gerry Sussman. “The Conniver Reference Manual” MIT AI Memo 259. May 1972.

Drew McDermott. The Prolog Phenomenon ACM SIGART Bulletin. Issue 72. July, 1980.

Vann McGee “In Praise of the Free Lunch: Why Disquotationalists Should Embrace Compositional Semantics” Self-Reference CSLI Publications. 2006.

Casey McGinnis “Paraconsistency and logical hypocrisy” The Logica Yearbook Praha. http://www.geocities.com/cnmcginnis/ParaLogHyp.pdf

George Milne and Robin Milner. “Concurrent processes and their syntax” JACM. April, 1979.

Robin Milner '”Elements of interaction: Turing award lecture'” CACM. January 1993.

Marvin Minsky (ed.) Semantic Information Processing MIT Press. 1968.

Marvin Minsky and Seymour Papert. “Progress Report on Artificial Intelligence” MIT AI Memo 252. 1971.

Marvin Minsky, Push Singh, and Aaron Sloman: “The St. Thomas Common Sense Symposium: Designing Architectures for Human-Level Intelligence” AI Magazine. Summer 2004.

Chris Mortensen. “The Validity of Disjunctive Syllogism is Not So Easily Proved.” Notre Dame Journal of Formal Logic January 1983.

Chris Mortensen. Inconsistent Mathematics Kluwer Academic Publishers. 1995.

Allen Newell and Herbert Simon. “The logic theory machine: A complex information processing system” IRE Transactions on Information Theory IT-2:61-79. 1956.

Mike Paterson and Carl Hewitt. “Comparative Schematology” MIT AI Memo 201. August 1970.

Carl Petri. Kommunikation mit Automaten. Ph.D. Thesis. University of Bonn. 1962.

Gordon Plotkin. “A powerdomain construction” SIAM Journal of Computing September 1976.

George Polya (1957) Mathematical Discovery: On Understanding, Learning and Teaching Problem Solving Combined Edition Wiley. 1981.

Karl Popper(1962). Conjectures and Refutations Basic Books.

Karl Popper. (1934) Logik der Forschung, Springer. (Logic of Scientific Discovery Routledge 2002).

Graham Priest. “Dialetheism” The Stanford Encyclopedia of Philosophy (Winter 2004 Edition)

Graham Priest, and Richard Routley (1989) “The History of Paraconsistent Logic” in Paraconsistent Logic: Essays on the Inconsistent Philosophia Verlag.

Graham Priest. “Paraconsistent Logic” Handbook of Philosophical Logic Volume 6, 2nd ed. Kluwer. 2002

Graham Priest and Koji Tanaka. “Paraconsistent Logic” The Stanford Encyclopedia of Philosophy. Winter 2004.

Graham Priest. “Wittgenstein‟s Remarks on Gödel‟s Theorem” in Wittgenstein‟s Lasting Significance Routledge. 2004.

Graham Priest (2006). “60% Proof: Lakatos, Proof, and Paraconsistency” http://garnet.acns.fsu.edu/~tan02/OPC%20Week%20Three/Priest.pdf

Stephen Reed and Doug Lenat. “Mapping Ontologies into Cyc” AAAI 2002 Conference Workshop on Ontologies for the Semantic Web July 2002.

Ray Reiter. Knowledge in Action: Logical Foundations for Specifying and Implementing Dynamical Systems. MIT Press, 2001.

Greg Restall (2006). “Curry‟s Revenge: the costs of non-classical solutions to the paradoxes of self-reference” (to appear in The Revenge of the Liar ed. J.C. Beall. Oxford University Press. 2007) July 12, 2006. http://consequently.org/papers/costing.pdf

John Alan Robinson, “A Machine-Oriented Logic Based on the Resolution Principle.” CACM. 1965.

Bill Roscoe. The Theory and Practice of Concurrency Prentice-Hall. Revised 2005.

Scott Rosenberg. Dreaming in Code. Crown Publishers. 2007.

Marcus Rossberg. “Second-Order Logic” Socrates Teaching Mobility Intensive Seminar, University of Helsinki, 16-19 May, 2005. http://www.st-andrews.ac.uk/~mr30/SOL/SOL3.pdf

John Barkley Rosser. “Extensions of Some Theorems of Gödel and Church” Journal of Symbolic. Logic. 1(3) 1936.

Philippe Rouchy (2006). “Aspects of PROLOG History: Logic Programming and Professional Dynamics” TeamEthno-Online Issue 2, June 2006.

Richard Routley (1979) “Dialectical Logic, Semantics and Metamathematics” Erkenntnis 14

Richard Routley Relevant Logics and Their Rivals 1 Ridgeview. 2003.

Jeff Rulifson, Jan Derksen, and Richard Waldinger. “QA4, A Procedural Calculus for Intuitive Reasoning” SRI AI Center Technical Note 73. November 1973.

Bertrand Russell. “Mathematical logic as based on the theory of types.”. American Journal of Mathematics. 1908.

Earl Sacerdoti, et. al., “QLISP A Language for the Interactive Development of Complex Systems” AFIPS. 1976.

Eric Sandewall. “A functional approach to non-monotonic logic” Computational Intelligence. Vol. 1. 1985.

Eric Sandewall. From Systems to Logic in the Early Development of Nonmonotonic Reasoning. CAISOR. July, 2006.

Davide Sangiorgi and David Walker. The Pi-Calculus: A Theory of Mobile Processes Cambridge University Press. 2001.


Marek Sergot. “Bob Kowalski: A Portrait” Computational Logic: Logic Programming and Beyond: Essays in Honour of Robert A. Kowalski, Part I Springer. 2004.

Dana Scott. “The Future of Proof” LICS 2006. http://www.easychair.org/FLoC-06/scott_goedel_keynote_floc06.pdf

Thoralf Skolem (1920) “Logico-combinatorial investigations on the satisfiability or provability of mathematical propositions: A simplified proof of a theorem by Löwenheim” (English translation in Jean van Heijenoort, 1967. From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931. Harvard Univ. Press)

Natarajan Shankar. Metamathematics, Machines, and Gödel‟s Proof Cambridge University Press. 1994.

Ehud Shapiro. “The family of concurrent logic programming languages” ACM Computing Surveys. September 1989

Stewart Shapiro. “Lakatos and logic Comments on Graham Priest‟s „60% proof: Lakatos, proof, and paraconsistency‟” Preprint 2006 http://garnet.acns.fsu.edu/~tan02/OPC%20Week%20Three/Commentary%20on%20Priest.pdf#search=%22paraconsistency%202006%20filetype%3Apdf%22

Wilfried Sieg and Clinton Field. “Automated search for Gödel proofs.” Annals of Pure and Applied Logic. 2005.

Aaron Sloman. “Must Intelligent Systems Be Scruffy?” Evolving Knowledge in Natural Science and Artificial Intelligence. Pitman. 1990.

Peter Smith. An Introduction to Gödel‟s Theorems. Draft. 2006. http://www.godelbook.net/

Lee Smolin. The Trouble with Physics: The Rise of String Theory, the Fall of a Science, and What Comes Next Houghton Mifflin. 2006

Craig Smorynski. “The Incompleteness Theorems” Handbook of Mathematical Logic. North Holland. 1977.

Michael Smyth. Power domains Journal of Computer and System Sciences. 1978.

Gerry Sussman, Terry Winograd and Eugene Charniak. “Micro-Planner Reference Manual (Update)” AI Memo 203A, MIT AI Lab, December 1971.

Alfred Tarski (1944) “The semantic conception of truth and the foundations of semantics” Philosophy and Phenomenological Research 4 (Reprinted in Readings in Philosophical Analysis, Appleton-1944)

Alfred Tarski and Robert Vaught (1957). “Arithmetical extensions of relational systems” Compositio Mathematica 13.

Paul Tillich. The Courage to Be Yale University Press. 2000.
Alan Turing. “On computable numbers, with an application to the Entscheidungsproblem.” Proceedings London Math Society. 1936.

Shunichi Uchida and Kazuhiro Fuchi (1992). Proceedings of the FGCS Project Evaluation Workshop Institute for New Generation Computer Technology (ICOT)

Jean van Heijenoort (1967) From Frege to Gödel. A Source Book in Mathematical Logic, 1879-1931, Harvard University Press.

Rineke Verbrugge (2003). "Provability Logic", The Stanford Encyclopedia of Philosophy Summer 2003 Edition.

Richard Waldinger and R. Lee (1969) “PROW: a step toward automatic program writing” IJCAI‟69.

Peter Whalley. “Modifying the metaphor in order to improve understanding of control languages—the little-person becomes a cast of actors.” British Journal of Educational Technology. 2006.

Bill Wilson (1952) Twelve Steps and Twelve Traditions Alcoholics Anonymous.

Terry Winograd. Procedures as a Representation for Data in a Computer Program for Understanding Natural Language. MIT AI TR-235. January 1971.

Ludwig Wittgenstein. Remarks on the Foundations of Mathematics. MIT Press. 1978.

Larry Wos, George Robinson, Daniel Carson (1965) “Efficiency and Completeness of the Set of Support Strategy in Theorem Proving” JACM 12(4).

Aki Yonezawa. Specification and Verification Techniques for Parallel Programs Based on Message Passing Semantics MIT EECS Ph. D. December 1977.

Ernst Zermelo. “Investigations in the foundations of set theory” (English translation in From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931 Ed. Jean van Heijenoort 1967). 1908.


Appendix. Additional Principles of Direct Logic
This appendix contains additional principles of Direct Logic.

Relevance Logic
Direct Logic is related to Relevance Logic [Mares 2006] which attempts to weed out certain inferences as unconvincing because they involve the introduction of irrelevancies. However, according to [Routley 1979], “The abandonment of disjunctive syllogism is indeed the characteristic feature of the relevant logic solution to the implicational paradoxes.” Since Direct Logic incorporates disjunctive syllogism ((Φ∨Ψ), ¬Φ ├ Ψ), it is not a Relevance Logic [Dunn and Restall 2002]. Relevance Logic is unsuited for practical reasoning about large software systems because it lacks standard Boolean equivalences, a useable Deduction Theorem, and a natural deduction proof system.
Classical logic allows many seeming irrelevancies to slip in that are not valid in the strongly paraconsistent theories of Direct Logic as in the following:

Classical Logic                      Direct Logic
├ (Φ⇨(Ψ⇨Φ))                         ⊬T (Φ⇨(Ψ⇨Φ))
├ ((Φ⇨Ψ)∨(Ψ⇨Θ))                     ⊬T ((Φ⇨Ψ)∨(Ψ⇨Θ))
├ ((Φ∧¬Φ)⇨Ψ)                        ⊬T ((Φ∧¬Φ)⇨Ψ)
├ (Φ⇨(Ψ∨¬Ψ))                        ⊬T (Φ⇨(Ψ∨¬Ψ))
├ (Φ∨¬Φ)                            ⊬T (Φ∨¬Φ)

However, note that the following hold in Direct Logic:
Direct Logic
Φ, Ψ ├T Φ
Ψ ├T (├T Ψ)
Ψ ├T (Φ∨Ψ)
(Φ⇨Ψ) ├T (Θ⇨(Φ⇨Ψ))80

Equality
Note that, in Direct Logic, equality (=) is not defined on (abstract) propositions.
80 Contrary to [Besnard and Schaub 2003].
Direct Logic has the following usual principles for equality:
x1=x1
x1=x2 ⇨ x2=x1
(x1=x2 ∧ x2=x3) ⇨ x1=x3

Nondeterministic λ-calculus
Direct Logic makes use of the nondeterministic λ-calculus as follows:
o If E1 and E2 are expressions, then E1⇾E2 (E1 can reduce to E2 in the nondeterministic λ-calculus) is a proposition.
o If E is an expression, then ⇓E (E always converges in the nondeterministic λ-calculus) is a proposition.
o If E is an expression, then ⇊E (E is irreducible in the nondeterministic λ-calculus) is a proposition.
o If E1 and E2 are expressions, then E1⇓E2 (E1 can converge to E2 in the nondeterministic λ-calculus) is a proposition.
o If E is an expression, then ⇓1E (E reduces to exactly 1 expression in the nondeterministic λ-calculus) is a proposition.
Basic axioms are as follows:

(true==false) ⇾ false
(false==true) ⇾ false
(if true then E1 else E2) ⇾ E1
(if false then E1 else E2) ⇾ E2
((E1⇾E2) ∧ (E2⇾E3)) ⇨ (E1⇾E3)
((λ(x) F(x))(E)) ⇾ F(E)    reduction
E1 | E2 ⇾ E1    nondeterministic reduction to first alternative
E1 | E2 ⇾ E2    nondeterministic reduction to second alternative
F1⇾F2 ⇨ F1(E)⇾F2(E)    an application reduces if its operator reduces
E1⇾E2 ⇨ F(E1)⇾F(E2)    an application reduces if its operand reduces
E1⇓E2 ⇨ (E1⇾E2)
E1⇓E2 ⇔ ((E1⇾E2 ∧ ⇊E2) ∨ (⇊E1 ∧ E1=E2))
⇊E1 ⇔ ((E1⇾E2) ⇨ E1=E2)
⇓E ⇨ E=E
⇊E1 ⇨ E1⇓E1
⇓(λ(x) E)
E1=E2 ⇨ (⇓1E1 ⇔ ⇓1E2)
(E1==E2) ⇔ (E1=E2)
(E1=E2 ∧ ⇓1F) ⇨ F(E1)=F(E2)
(F1=F2 ∧ ⇓1E) ⇨ F1(E)=F2(E)
P[E] ⇨ (⇓1P ∧ ⇓1E)
(E1=E2 ∧ ⇓1P) ⇨ (P[E1] ⇨ P[E2])
⇓1F ⇨ F=(λ(x) F(x))    abstraction
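To make the behavior of the nondeterministic choice construct concrete, the following Python sketch (an illustration only; the Lit, Add, and Choice classes and the converges_to function are invented for this example and are not part of Direct Logic or ActorScript) enumerates the values a small expression containing choices can converge to, treating a choice as reducible to either of its alternatives.

# Illustrative sketch only: a miniature expression language in which
# Choice(e1, e2) may reduce to either branch, mirroring the "E1 | E2" construct.
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Lit:
    value: int          # an irreducible literal

@dataclass(frozen=True)
class Add:
    left: "Expr"        # strict addition of two subexpressions
    right: "Expr"

@dataclass(frozen=True)
class Choice:
    first: "Expr"       # nondeterministic choice between two subexpressions
    second: "Expr"

Expr = Union[Lit, Add, Choice]

def converges_to(e: Expr) -> set:
    """All values the expression can converge to (its possible results)."""
    if isinstance(e, Lit):
        return {e.value}
    if isinstance(e, Add):
        return {a + b for a in converges_to(e.left) for b in converges_to(e.right)}
    # Choice: either reduction is allowed, so collect the results of both branches.
    return converges_to(e.first) | converges_to(e.second)

# (1 | 2) + (10 | 20) can converge to 11, 12, 21, or 22.
print(converges_to(Add(Choice(Lit(1), Lit(2)), Choice(Lit(10), Lit(20)))))

With an unbounded choice such as the IntegerGenerator defined in the next section, the set of possible outcomes is infinite, which is why convergence is stated per outcome rather than as a single deterministic result.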

Set Theory
The set of all sets in Direct Logic is called Sets and is axiomatised below.
x: x∉{ }    the empty set { } has no elements
s∈Sets: { }⊆s    { } is a subset of every set
Since Direct Logic uses choice functions instead of existential quantifiers, we have the following axiom:
s∈Sets: s≠{ } ⇨ Choice(s)∈s
Note that Sets∉Sets.
The basic axioms of set theory are:
s1,s2∈Sets; x: s1⊆s2 ⇨ (x∈s1 ⇨ x∈s2)    if s1 is a subset of s2, then x is an element of s1 implies x is an element of s2
s1,s2∈Sets: (s1={ } ∨ SubsetChoices2(s1)∈s2) ⇨ s1⊆s2
  where s1,s2∈Sets: s1≠{ } ⇨ SubsetChoices2(s1)∈s1
  if s1 is empty or the choice of an element of s1 (depending in an arbitrary way on s2) is also an element of s2, then s1 is a subset of s2
x; s1,s2∈Sets: x∈s1∪s2 ⇔ (x∈s1 ∨ x∈s2)81
x; s1,s2∈Sets: x∈s1∩s2 ⇔ (x∈s1 ∧ x∈s2)
x; s1,s2∈Sets: x∈s1-s2 ⇔ (x∈s1 ∧ x∉s2)
x; y: x∈{y} ⇔ x=y
The function Count is defined as follows:
Count(s) ≡ if s={ } then 0 else 1+Count(s-{Choice(s)})
s∈Sets: Finite[s] ⇔ ⇓Count(s)    a set s is finite if and only if Count(s) converges
The integers can be defined as follows using the nondeterministic λ-calculus:
IntegerGenerator() ≡ 0 | (1+IntegerGenerator())    IntegerGenerator() is the nondeterministic choice of 0 and 1+IntegerGenerator()
x: x∈Integers ⇔ IntegerGenerator()⇓x    x is an integer if and only if IntegerGenerator() can converge to x

81 In general we have the following: Suppose that s is a nonempty set. Then
x: x∈∪i∈s F(i) ⇔ x∈F(UnionChoiceF(s,x))    where x: UnionChoiceF(s,x)∈s
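As a rough illustration of how Count and IntegerGenerator behave, here is a Python sketch; the helper names (choice, count_elements, integer_generator) and the deterministic stand-in for Choice are assumptions introduced for this example, not definitions from the paper.

# Illustrative sketch: finite sets as Python frozensets, Count as a recursion
# that removes one chosen element at a time, and the integers as the values
# that 0 | (1 + IntegerGenerator()) can unfold to.
from itertools import count, islice

def choice(s):
    """Stand-in for Choice(s): pick some element of a nonempty set."""
    return next(iter(s))

def count_elements(s):
    """Count(s) = if s = {} then 0 else 1 + Count(s - {Choice(s)})."""
    if not s:
        return 0
    return 1 + count_elements(s - {choice(s)})

def integer_generator():
    """The values IntegerGenerator() can converge to: 0, 1, 2, ..."""
    return count(0)

print(count_elements(frozenset({"a", "b", "c"})))    # 3, so Finite holds for this set
print(list(islice(integer_generator(), 5)))          # [0, 1, 2, 3, 4]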

Noncompactness
The Actor model makes use of two fundamental orders on events [Baker and Hewitt 1977; Clinger 1981, Hewitt 2006]:
1. The activation order (≈≈>) is a fundamental order that models one event activating another (there is energy flow from an event to an event which it activates). The activation order is discrete:
e1,e2∈Events: Finite[{e∈Events | e1 ≈≈> e ≈≈> e2}]
2. The arrival order of a serialized Actor x (→x) models the (total) order of events in which a message arrives at x. The arrival order of each x is discrete:
e1,e2∈Events: Finite[{e∈Events | e1 →x e →x e2}]
The combined order (denoted by →) is defined to be the transitive closure of the activation order and the arrival orders of all Actors. So the following question arose in the early history of the Actor model: “Is the combined order discrete?” Discreteness of the combined order captures an important intuition about computation because it rules out counterintuitive computations in which an infinite number of computational events occur between two events (à la Zeno).
Hewitt conjectured that the discreteness of the activation order together with the discreteness of all arrival orders implies that the combined order is discrete. Surprisingly, [Clinger 1981; later generalized in Hewitt 2006] answered the question in the negative by giving a counterexample. The counterexample is remarkable in that it violates the compactness theorem for 1st order logic:
Any finite set of sentences is consistent (the activation order and all arrival orders are discrete) and represents a potentially physically realizable situation. But there is an infinite set of sentences that is inconsistent with the discreteness of the combined order and does not represent a physically realizable situation.
The counterexample is not a problem for Direct Logic because the compactness theorem does not hold. The resolution of the problem is to take discreteness of the combined order as an axiom of the Actor model:82
e1,e2∈Events: Finite[{e∈Events | e1 → e → e2}]
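The following Python sketch (the event names and helper functions are invented for illustration) computes the combined order as the transitive closure of an activation order together with per-Actor arrival orders, and then collects the events lying strictly between two given events, which is the set the discreteness axiom requires to be finite.

from itertools import product

# Toy event data (invented): e ≈≈> e' pairs and the arrival order at Actor x.
activation = {("e1", "e2"), ("e2", "e4")}
arrival = {"x": [("e1", "e3"), ("e3", "e4")]}

def combined_order(activation, arrival):
    """Transitive closure of the activation order and all arrival orders."""
    edges = set(activation)
    for order in arrival.values():
        edges |= set(order)
    closure = set(edges)
    while True:
        new = {(a, d) for (a, b), (c, d) in product(closure, closure) if b == c}
        if new <= closure:
            return closure
        closure |= new

def events_between(closure, e1, e2):
    """Events strictly between e1 and e2 in the combined order."""
    return {e for (a, e) in closure if a == e1 and (e, e2) in closure}

order = combined_order(activation, arrival)
print(events_between(order, "e1", "e4"))   # {'e2', 'e3'}: finitely many, so discrete here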

Direct Logic is based on XML
We speak in strings, but think in trees.
---Nicolaas de Bruijn83
The base domain of Direct Logic is XML84. In Direct Logic, a dog is an XML dog, e.g.,
<Dog><Name>Fido</Name></Dog> ∈ Dogs ⊆ XML
82 The axiom can be justified using results from General Relativity.
83 Quoted by Bob Boyer [personal communication 12 Jan. 2006].


Unlike First Order Logic, there is no unrestricted quantification in Direct Logic. So the proposition ∀d∈Dogs: Mammal[d] is about dogs in XML. The base equality built into Direct Logic is equality for XML, not equality in some abstract “domain”. In this way Direct Logic does not have to take a stand on the various ways that dogs, photons, quarks and everything else can be considered “equal”!
This axiomatization omits certain aspects of standard XML, e.g., attributes, namespaces, etc.
Two XML expressions are equal if and only if they are both atomic and identical, or are both elements and have the same tag and the same number of children such that the corresponding children are equal.
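A minimal sketch of this equality in Python, assuming an invented representation in which an atomic is a Python string and an element is a (tag, children) pair; attributes and namespaces are omitted, as in the axiomatization above.

def xml_equal(x, y):
    """Equal iff both atomic and identical, or both elements with the same tag,
    the same number of children, and pairwise equal corresponding children."""
    atomic_x, atomic_y = isinstance(x, str), isinstance(y, str)
    if atomic_x or atomic_y:
        return atomic_x and atomic_y and x == y
    tag_x, children_x = x
    tag_y, children_y = y
    return (tag_x == tag_y
            and len(children_x) == len(children_y)
            and all(xml_equal(a, b) for a, b in zip(children_x, children_y)))

fido = ("Dog", [("Name", ["Fido"])])      # <Dog><Name>Fido</Name></Dog>
rex  = ("Dog", [("Name", ["Rex"])])
print(xml_equal(fido, fido))   # True
print(xml_equal(fido, rex))    # False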

The following are axioms for XML:
(Atomics ∪ Elements) = XML
(Atomics ∩ Elements) = { }    Atomics and Elements are disjoint
Tags ⊆ Atomics
x: x∈Elements ⇔ x = <Tag(x)> x1…xLength(x) </Tag(x)>
  where xi is the ith subelement of x, Tag(x) is the tag of x, and Length(x) is the number of subelements of x
A set p⊆XML is defined to be inductive (written Inductive[p]) if and only if it contains the atomics and, for all elements that it contains, it also contains every element with those sub-elements:
p⊆XML; x1…xn∈p; t∈Tags: Inductive[p] ⇨ (Atomics⊆p ∧ <t> x1…xn </t>∈p)
The Principle of Induction for XML is as follows:
p⊆XML: Inductive[p] ⇨ p = XML

XML Plus (XML+) is the domain of Direct Logic that is obtained by first extending the Atomics (described above) with Actors85 (see [Hewitt 2009c]) in order to create XMLwithActors. Then XML+ is defined recursively by the following axioms:
XML+0 ≡ XMLwithActors
i; x: (x∈XML+i+1 ⇔ x⊆XML+i)
XML+ ≡ ∪i XML+i

84 Lisp was an important precursor of XML. The Atomics axiomatised below correspond roughly to atoms and the Elements to lists.
85 λ-expressions are a subset of Actors (see appendix below).

The universe of sets can be defined as follows:86
Sets ≡ XML+ - XMLwithActors
Subsets of elements of XML+ can be defined using the following Restricted Comprehension Axiom:
d; e: e∈{X∈d | P[X]} ⇔ (P[e] ∧ e∈d)

Theorem. XML+ is the universe, i.e.,87
⇓E ⇨ (E∈XML+ ∨ E⊆XML+)
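As a small illustration of the Restricted Comprehension Axiom, the Python sketch below (the comprehend helper is invented for this example) forms { X ∈ d | P[X] } only relative to a given domain d, so every member of the result is already a member of d.

def comprehend(d, P):
    """Return { X in d | P[X] }: the elements of d satisfying P."""
    return {x for x in d if P(x)}

digits = set(range(10))
evens = comprehend(digits, lambda x: x % 2 == 0)
print(evens)                              # {0, 2, 4, 6, 8}
print(all(x in digits for x in evens))    # True: membership implies membership in d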

Provably Inference Reflected Propositions in Theories of Direct Logic
Don’t believe everything you think.
Thomas Kida [2006]

Provably Inference Reflected propositions Ψ for T are those such that:
├T ((├T Ψ) ├T Ψ)

Naively one might suppose that the above proposition could be taken as an axiom of Direct Logic. The naive intuition is that if a proposition is provable in a theory, then it can be inferred in the theory. However, as shown below, if the above proposition were taken as an axiom, then every proposition would be provable!88
A way to understand this paradox is as follows:
Definition. PrInfers ≡ Fix(Diagonalize), where Diagonalize ≡ λ(s) ((├T s) ├T Ψ)

86 Note that Sets∉Sets.
87 What about Cantor’s set defined as follows: Cantor ≡ {x∈XML+ | x⊆XML+}? Clearly Cantor⊆XML+. This illustrates that Cantor is not all subsets of XML+, just the ones whose elements are in XML+. For example XML+∉Cantor even though XML+⊆XML+ because XML+∉XML+. It is impossible in Direct Logic to get “outside” XML+ and its subsets.
88 Modulo questions of Admissibility.


In Direct Logic, simply because a proposition is provable in a theory (i.e., there is an argument in the theory for the proposition) is not by itself sufficient to infer in the theory that the proposition holds. Instead, arguments both for and against the proposition should be considered.


Theorem89: If Ψ is Provably Inference Reflected for T and (├T PrInfers) ├T Ψ is Admissible for T, then ├T Ψ.
Proof: Suppose that Ψ is Provably Inference Reflected for T and that (├T PrInfers) ├T Ψ is Admissible for T. It is sufficient to prove ├T Ψ.
Lemma: ├T (PrInfers ⇔ ((├T PrInfers) ├T Ψ))
Proof:
PrInfers ⇔ Fix(Diagonalize)
  ⇔ Diagonalize(Fix(Diagonalize))
  ⇔ (λ(s) ((├T s) ├T Ψ))(Fix(Diagonalize))
  ⇔ ((├T Fix(Diagonalize)) ├T Ψ)
  ⇔ ((├T PrInfers) ├T Ψ)    by Admissibility of (├T PrInfers) ├T Ψ

Proof of theorem:90
Suppose ├T ((├T Ψ) ├T Ψ). We need to show that ├T Ψ.
├T (PrInfers ├T ((├T PrInfers) ├T Ψ))    lemma
├T ((├T PrInfers) ├T (├T ((├T PrInfers) ├T Ψ)))    soundness on above
├T ((├T PrInfers) ├T (├T ((├T ├T PrInfers) ├T (├T Ψ))))    soundness on (├T PrInfers) ├T Ψ
├T ((├T PrInfers) ├T (├T ├T PrInfers))    adequacy
├T ((├T PrInfers) ├T (├T Ψ))    detachment
├T ((├T PrInfers) ├T Ψ)    transitivity on hypothesis
├T PrInfers    transitivity on lemma
├T ├T PrInfers    adequacy on ├T PrInfers
├T ((├T ├T PrInfers) ├T (├T Ψ))    soundness on (├T PrInfers) ├T Ψ
├T ((├T PrInfers) ├T (├T Ψ))    adequacy on ├T PrInfers
├T ├T Ψ    detachment on ├T PrInfers
├T Ψ    faithfulness on ├T Ψ

89 Generalization of Löb’s Theorem [Löb 1955]. Also see [Dean and Kurokawa 2009].
90 The proof is an adaptation for Direct Logic of [Löb 1955; Verbrugge 2003].