



General Information in Relevant Logic

Edwin D. Mares
Victoria University of Wellington

October 24, 2008

Abstract

This paper sets out a philosophical interpretation of the model theory in Mares and Goldblatt, "An Alternative Semantics for Quantified Relevant Logic" (The Journal of Symbolic Logic 71 (2006)). This interpretation distinguishes between truth conditions and information conditions. Whereas the usual Tarskian truth condition holds for universally quantified statements, their information condition is quite different. The information condition utilizes general propositions. The present paper gives a philosophical explanation of general propositions and argues that these are needed to give an adequate theory of general information.

1 Introduction

This paper is a sequel to (19). In that article, we set out a formal semantics for quantified relevant logic.1 Here I provide a philosophical interpretation for that semantics. This interpretation is undertaken in the framework of an informational semantics. An informational semantics, on my view, does not set out truth conditions for the connectives. Rather, it gives information conditions for the various connectives. For most of the standard connectives the usual classical truth conditions hold. An information condition is a condition that holds in states of partial information (situations) that indicates that the truth condition for the statement in question obtains.

My particular purpose is to set out an information condition for the universal quantifier. The nature of general information is a puzzle that has been familiar to philosophers at least since Russell discussed it in (21). We can know of each mammal in my lounge, for example, that it is a dog, without thereby having the information that all the mammals in my lounge are dogs. Some extra information is needed. The problem, then, is to formulate what this extra information is and to show how our formal semantics adequately represents our informal understanding.

1 The formal problem that was solved by our semantics was that the standard quantified relevant logics (which we call "QR" and "RQ") were incomplete over a standard Tarskian semantics (as Kit Fine showed (12)). Fine also constructed a semantics over which RQ is complete, but it is very complicated and not very intuitive (although extremely ingenious) ((11)).


I begin the paper by presenting my version of informational semantics. This is the theory of situated inference developed in (17). I then make some modifications and additions to the theory of situated inference, both because they are philosophically warranted and because they are required by my theory of general information. One particular addition is the use of a class of propositions. I argue that the use of propositions in the semantics is philosophically innocuous. Once the framework is set down, I proceed to develop a theory of general information. I give a brief and relatively informal presentation of the formal semantics, motivating it by appeal to the natural deduction system for the relevant logic R. I then examine three examples of general information: an ordinary case of visual information; a case of legal information; and a case in which there is mathematical information. I then make the theory more realistic by adding an existence predicate to the logic. This allows us to vary the domains from situation to situation. The paper ends with an appendix that contains the model theory.

2 The Need for Situations

I use a theory of information as a semantics for a given logic. This is a very different project to that of constructing a logic of information flow.2 The language of a logic of information flow needs to represent the nodes and relations involved in information flow, such as agents, changes of states of agents, informational channels, and so on. To give a semantics for a logic, in contrast, is to give a theory of what the various particles of that language represent. On my view, a statement represents both its truth conditions and its information conditions. This bifurcated semantics will be presented later. First we need to present a framework for this semantics. This framework is the theory of situations.

I take the basis for the theory of situations from Barwise and Perry (8). A situation is an abstract structure. It is constituted in part by a set of fact-like entities called "states of affairs". There are various ontologies of states of affairs in the literature (see, e.g., (4), (7)). I do not assume any particular such theory; any will do for my purposes. Situations carry information. How they do so will be dealt with in the next section below. A situation may accurately describe a possible world or it may fail to do so. In accurately describing a world, a situation need not describe everything that is true of the world, but rather all the information carried by the situation must be true of that world. A situation which accurately describes a possible world is said to be a possible situation and a situation which does not accurately represent any possible world is said to be impossible.

Why use situations? The reason comes from the proof theory of relevant logic.

2 Most of the other contributions to this special issue are attempts to do the latter. I think that the two projects are compatible. Exactly what they have to do with one another, though, is far from clear. It would be an excellent project to look at the relationship between them.


Relevant logic was created to avoid the so-called paradoxes of material and strict implication. In terms of the natural deduction system for relevant logic3 the paradoxes are avoided by imposing a condition that all premises or hypotheses of a proof really be used in the proof. To understand how this works and what it has to do with the use of situations, let's consider an example.

The paradox that we will look at is: A → (B ∨ ¬B). It is a paradox because A appears to have nothing to do with B ∨ ¬B. Standard classical proofs for this have the following form:

    | A                      hyp
    | some proof of B ∨ ¬B
    A → (B ∨ ¬B)             →I

Here the hypothesis that A is merely tacked on at the start and then discharged in the last step, even though it may have had nothing to do with the proof of B ∨ ¬B.

In order to reject such proofs, Anderson and Belnap add subscripts to each line of a proof to indicate which hypotheses or premises were used in the derivation of that line. For example, here is a relevant proof of A → ((A → B) → B):

    1.  | A_{1}                      hyp
    2.  | | A → B_{2}                hyp
    3.  | | A_{1}                    1, reit
    4.  | | B_{1,2}                  2, 3, →E
    5.  | (A → B) → B_{1}            2–4, →I
    6.  A → ((A → B) → B)_∅          1–5, →I

When a hypothesis is introduced, it is given a number. When a hypothesis is used in rules such as →E, its number is added to the subscripted set of the conclusion of that rule. When a hypothesis is discharged, as it is in an instance of →I, its number is removed from the subscripted set of the conclusion of the rule. An empty set subscripted to a line indicates that the formula on that line is a theorem of the logic.

We won't go into the rules of proof for all the connectives here. There are plenty of sources in which one can find them (e.g., (2), (19)). What is of interest to us here is the interpretation of the subscripts. I interpret a hypothesis or premiss in a proof, A_{j}, as meaning that the information that A is carried by a situation s_j. When one makes this hypothesis in a proof, what she is saying is "assume that there is a world in which a situation j obtains and j carries the information that A". When further hypotheses are made, say, B_{k}, the person doing the proof is also assuming that there is a situation s_k in the same world which carries the information that B. When we have multiple numbers in the subscript, as in C_{1,...,n}, in the same proof, what is shown is that there is a situation in the same world as s_1, ..., s_n which carries the information that C, and the information in all of s_1, ..., s_n is really used to derive that this is so.

3 Here and throughout the paper, by "relevant logic" I mean the logic R of relevant implication. This is somewhat unfair. There are many other (infinitely many other) relevant logics, and what I say cannot be applied to all or even most of them.


This is the core of the theory of situated inference developed in (17). Note that this semantical analysis appeals to the notion of real use; it is not defined in the semantical theory but rather it is assumed by the semantical theory (see (17)).

The fact that situations provide partial representations of worlds is crucial to their use in the semantics of relevant logic. If all situations were complete, in the sense that they all carried the information that every instance of the law of excluded middle held, then on this semantical analysis it would be very difficult to invalidate the formula A → (B ∨ ¬B). In fact, for each formula of our language, we need at least one situation in some model at which it fails.4
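To make the subscript bookkeeping concrete, here is a minimal sketch (in Python, with invented helper names; nothing in it belongs to the formal system itself) of how the index sets propagate: →E takes the union of the subscripts of its premises, and →I discharges the hypothesis being conditionalized on, removing its index and insisting that it was really used.

```python
# Minimal sketch of Anderson-Belnap subscript bookkeeping (illustrative only).
# A "line" is a pair: (formula, frozenset of hypothesis indices used).

def hyp(formula, index):
    """Introduce a hypothesis; it depends only on itself."""
    return (formula, frozenset({index}))

def arrow_elim(conditional, antecedent):
    """From A -> B (subscript a) and A (subscript b), infer B with subscript a ∪ b."""
    (ant, cons), a = conditional[0], conditional[1]
    formula, b = antecedent
    assert formula == ant, "antecedent must match the conditional"
    return (cons, a | b)

def arrow_intro(hypothesis, conclusion):
    """Discharge the hypothesis: its index must really occur in the conclusion's subscript."""
    (h, hs), (c, cs) = hypothesis, conclusion
    (index,) = tuple(hs)
    assert index in cs, "relevance condition: the hypothesis must actually be used"
    return ((h, c), cs - {index})

# The proof of A -> ((A -> B) -> B) from the display above (reiteration not modelled):
line1 = hyp("A", 1)                          # A_{1}
line2 = hyp(("A", "B"), 2)                   # (A -> B)_{2}
line4 = arrow_elim(line2, line1)             # B_{1,2}
line5 = arrow_intro(line2, line4)            # ((A -> B) -> B)_{1}
line6 = arrow_intro(line1, line5)            # (A -> ((A -> B) -> B))_{} : a theorem
print(line6)
```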

3 Situations and Worlds

In (17), I held that a world w made true a statement A if and only if there was a situation s in w such that s ⊨ A. I think that this biconditional still holds at least of the actual world, but I no longer think that a statement is true at a world because the information that it is true is carried at some situation in that world. It seems to me that this inverts the real direction of explanation.

I suggest that the classical truth conditions for conjunction, disjunction, negation, and the quantifiers are correct. These conditions, relativized to possible worlds, are, of course, the following:

• A ∧ B is true at w if and only if A is true at w and B is true at w

• A ∨ B is true at w if and only if A is true at w or B is true at w

• ¬A is true at w if and only if A is not true at w

• ∃xA is true at w if and only if A is true at w on some interpretation of the variable x

• ∀xA is true at w if and only if A is true at w for all interpretations of the variable x
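As a toy illustration of how directly these clauses can be read off, here is a minimal sketch of my own (not from (19)) of a recursive evaluator for this classical, single-world fragment; formulas are nested tuples and a world is just a finite domain plus a set of true atomic facts.

```python
# Toy evaluator for the classical truth conditions at a single world.
# A world is (domain, facts), where facts is a set of ground atoms like ("Dog", "zermela").

def true_at(world, formula, env=None):
    env = env or {}
    domain, facts = world
    op = formula[0]
    if op == "atom":                      # ("atom", predicate, term)
        _, pred, term = formula
        return (pred, env.get(term, term)) in facts
    if op == "and":
        return true_at(world, formula[1], env) and true_at(world, formula[2], env)
    if op == "or":
        return true_at(world, formula[1], env) or true_at(world, formula[2], env)
    if op == "not":
        return not true_at(world, formula[1], env)
    if op == "exists":                    # ("exists", variable, body)
        _, var, body = formula
        return any(true_at(world, body, {**env, var: d}) for d in domain)
    if op == "forall":                    # ("forall", variable, body)
        _, var, body = formula
        return all(true_at(world, body, {**env, var: d}) for d in domain)
    raise ValueError(op)

lounge = ({"zermela", "lola"}, {("Mammal", "zermela"), ("Dog", "zermela"),
                                ("Mammal", "lola"), ("Dog", "lola")})
# "Every mammal in the lounge is a dog":
print(true_at(lounge, ("forall", "x", ("or", ("not", ("atom", "Mammal", "x")),
                                       ("atom", "Dog", "x")))))
```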

The standard truth conditions for the connectives have the nice property that they are homomorphic, that is, they are interpreted very directly in terms of the cognate connectives of the metalanguage. So, according to this theory, "and" means "and" (in the metalanguage), "or" means "or", and so on. Thus, it would seem to be a virtue of a semantic theory if it could retain the classical treatment of these connectives.

But there is a problem. As I said in the introductory section above, the semantics that I use for relevant logic is informational. That is, the logical properties of a statement are to be understood primarily in terms of the information that is conveyed by it.

4 At least this is true for standard versions of propositional and first order relevant logics. If we add the so-called Church constants T and F, we make T hold at every situation and F fail at every situation. We then also add all instances of the theorem schemes A → T and F → A. Also, if we have propositional quantifiers, we make ∃pp hold at every situation and ∀pp fail at every situation.


Information and truth are closely linked, but they are not identical. While I am writing this sentence I am travelling in a train carriage. I have certain information available in my surroundings. I can see that the person sitting across from me is reading a magazine in French, I can hear that others are eating, talking too loudly on cellphones, and so on. Other statements are true of this carriage, such as those that stipulate where it was made, how old it is, and so on, but the truth of these is not among the information available to my senses. Not all information need be experientially available to one, but all of it is situated. In a given world, situations carry some information and others carry different information. Truth is not like this. In a given possible world, a proposition is either true or false.

Thus I distinguish between truth conditions and information conditions. The information condition for a connective generalizes the conditions under which we have information of a given type available in actual contexts. The most obvious condition is for conjunction, which merely repeats the truth condition, viz., s ⊨ A ∧ B if and only if s ⊨ A and s ⊨ B. (Here "s ⊨ A" means "s carries the information that A".) In section 4 below, we will look briefly at information conditions for negation and implication, and after that we will examine the information condition for the universal quantifier. We will set aside the information conditions for disjunction and the existential quantifier, since they are rather difficult and deserve to be discussed in a paper dedicated just to them.

As a simplifying assumption, we treat all possible worlds and all situations as having the same domain of individuals, which we call I. Clearly, in our final view, we want situations to have different domains of individuals. By and large, situations are supposed to carry only partial information about worlds. Situations should not all, therefore, tell us exactly which individuals exist. In section 11 below, I remedy this by showing how to alter the semantics to treat variable domains.

4 Information Conditions

In this semantical theory, each connective has associated with it an information condition. The information conditions generalize the conditions under which information of the salient sorts is available to us in concrete contexts. Before we turn to the information condition for the universal quantifier, let's look briefly at two other connectives, negation and implication, so that we can see two rather different ways of determining information conditions.

First, we will consider negation. A situation, as we have said, may be partial. But it does represent worlds as making true particular statements or propositions. For example, as I write this sentence, it is sunny in Wellington. Thus, the situation that captures the information to which I have visual access represents Wellington as being not cloudy. It does so by containing information that is incompatible with its being cloudy (i.e. the information that it is sunny).


We generalize this sort of example by placing a binary relation (the incompatibility relation) on situations and then formulating a general information condition using it. The information that ¬A is carried by a situation s if and only if s is incompatible with any situation which carries the information that A.5

My information condition for implication is motivated by the implicational properties of that connective. According to the theory of situated inference, the primary sort of inference that we make with implicational information is a form of modus ponens. Suppose that we have the information in a situation s that A → B. This tells us that we may draw certain conclusions from certain sorts of hypotheses. In particular, let us hypothesize that there is a situation t in the same world as s such that t carries the information that A. Our implicational information allows us to infer that there is also a situation u in the same world as s and t such that u carries the information that B.

The information condition for implication, then, is just this: s ⊨ A → B if and only if, for any situation t that carries the information that A, if t is hypothesized to obtain in the same world as s, then (really using the information in both s and t) we can derive that there is also a situation in that world which carries the information that B.

There are a few points that need to be made about this information condition. First, the notion of the real use of hypotheses appears in the information condition for the connective (as well as in the description of the natural deduction system). One might have expected that the semantics for the natural deduction system gave an explanation of real use, instead of utilizing that very same notion. Instead, the notion of real use has an intuitive meaning (that I think is clear) and a more technical contextual definition in terms of the natural deduction system and in the semantic theory. The completeness theorem, in my opinion, shows that these two definitions succeed in defining what is in effect the same notion.

Second, it might seem disappointing to some6 that the inferential properties are used to motivate a semantical analysis rather than having the semantics justify the inferential properties of the connective. I think that this is inevitable with an informational semantics. The way in which information is made available to us must closely reflect the way in which we obtain and manipulate it.

Third, one might wonder under what conditions there is sufficient information available to make inferences of this sort. Information that allows us to make situated inferences is of a sort that (17) calls "informational links". An informational link is a perfectly reliable connection, such as a law of nature or a convention. Thus, for example, if we have the information that it is a law that every physical object attracts every other physical object, then given a situation (real or hypothetical) which carries the information that i and j are physical, we can infer that there is also a situation in the same world in which i attracts j.

5 This analysis of negation in relevant logic by means of a compatibility relation is due to Dunn (10).

6 Johan van Benthem expressed this very disappointment. I am indebted to him for making me face up to this issue.


Not only are informational links responsible for situated inference, but we are also allowed to manipulate hypotheses by logical means. For example, we can rearrange hypotheses in whatever order we wish. Suppose that s ⊨ A → (B → C). Let us hypothesize that there is a situation in the same world as s which carries the information that A and a situation in that same world which carries the information that B. Then we can infer that there is a situation that carries the information that C. There is nothing in the order of these hypotheses that seems to affect the inference. That is, we can make the hypothesis first that there is a situation that carries the information that B and then the hypothesis that there is one that carries the information that A. We would still be entitled to infer that there is a situation that carries the information that C. Thus, by the information condition for implication, it would seem that s ⊨ B → (A → C). This shows that allowing the rearrangement of hypotheses makes valid certain inferences.

Note that the concept of an informational link that I utilize here is very different from the one in channel theory, which is also used as a basis for a semantics for relevant logic (Barwise (1993), Restall (1996)). A channel is an entity that connects pairs of situations. It can be considered to be a situation as well. Consider an example from Restall (1996): the statement that for any number n, if n is even so is n+2. Restall represents this as saying that from any situation which carries the information that a particular number n is even, there is also a situation that carries the information that n+2 is even. And connecting these two situations is another situation that carries the rules of arithmetic that yield a proof that n+2 is even given that n is even (ibid. 471f). The philosophical difference between my semantics and the Barwise-Restall semantics is that they consider links to be situations (or situation-like entities), whereas for me they are information that is contained in situations. The formal difference is that channel theory is most naturally used as a semantics for weaker relevant logics that, in particular, do not contain this version of modus ponens: ((A → B) ∧ A) → B. For in order to make that thesis valid, all models would have to be constrained so that every channel is a channel from itself to itself, which is not an easy postulate to motivate.
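To make the two conditions of this section concrete, here is a minimal finite-model sketch (my own illustration, not the official semantics of (19); the situation names, relations and valuation are invented) of the clauses the appendix states formally: ¬A is carried by s when every situation compatible with s fails to carry A, and A → B is carried by s when the ternary relation R takes every A-carrying situation paired with s to a B-carrying situation.

```python
# Sketch of the information conditions for negation and implication
# over a finite toy frame. S, C, R and the atomic valuation are all made up.

S = {"s0", "s1", "s2"}
C = {("s0", "s0"), ("s0", "s1"), ("s1", "s1"), ("s2", "s2")}      # compatibility
R = {("s0", "s1", "s1"), ("s0", "s2", "s2"), ("s1", "s1", "s1")}  # ternary relation

atoms = {"A": {"s1"}, "B": {"s1", "s2"}}

def carries(s, formula):
    op = formula[0]
    if op == "atom":
        return s in atoms[formula[1]]
    if op == "not":     # s carries ¬A iff no situation compatible with s carries A
        return all(not carries(t, formula[1]) for t in S if (s, t) in C)
    if op == "imp":     # s carries A → B iff for all t, u with Rstu: t carries A implies u carries B
        return all(carries(u, formula[2])
                   for (x, t, u) in R if x == s and carries(t, formula[1]))
    raise ValueError(op)

print(carries("s0", ("imp", ("atom", "A"), ("atom", "B"))))   # True on this frame
print(carries("s2", ("not", ("atom", "A"))))                  # True: s2 is compatible only with itself
```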

5 Propositions

Before we can go on to examine the information condition for universally quantified statements, we will need to understand another key notion in our semantical theory. This notion is that of a proposition. A proposition here is to be taken in its usual sense of what is believed, denied, desired, wondered about, and so on. That is, a proposition is a possible object of propositional attitudes. Our propositions are unstructured, like UCLA propositions, but are taken to be sets of situations rather than sets of worlds. I don't deny that there may be structured propositions in addition to unstructured ones, but as the bearers of logical relations (especially entailments), I think that unstructured propositions are a necessary part of semantics. The key relation that we have in mind is entailment.


On the unstructured proposition view, entailment is just the subset relation, that is, a proposition π entails a proposition π′ if and only if π ⊆ π′.

The Routley-Meyer semantics for relevant logic (which is the formal basis for my philosophical semantics below) places a partial order, ⊑, on situations.7 On my reading, taken from Barwise and Perry's understanding of a very similar relation, ⊑ is a relation of informational containment. This means that for situations s and s′, s ⊑ s′ if and only if all the information carried by s is also carried by s′. Thus, it would seem that, since they are to be taken to represent information, propositions must be closed upwards under ⊑. In more mathematical terminology, propositions are "upsets". We will use this fact about propositions again in section 10.1 below.

Not just any upset of situations should count as a proposition in our semantics. Our semantics, as we have said, is informational; in particular it is supposed to capture the way in which we understand the world. Thus, the limits of what counts as a situation depend on the discriminatory capacities of human beings. A proposition is a set of situations between all of which people could find important similarities. Arbitrary sets of situations may not satisfy this requirement. It falls out of our view of propositions that every statement expresses a proposition. That is, for any formula A, the set of situations that carry the information that A is a proposition.

Note that the entailment relation and the informational containment relation are not identical (they are, in a certain sense, dual to one another). Propositions, which are sets of situations, entail one another. The containment relation relates situations. If a ⊑ b, then the total information in b entails all the information in a. In that sense they are dual to one another.
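A small sketch of these two notions, on invented data: propositions are upward-closed sets of situations with respect to ⊑, entailment between propositions is just ⊆, and the two relations run in opposite directions in the way just described.

```python
# Sketch: propositions as upsets of a toy information ordering, entailment as subset.
# The ordering "leq" (s ⊑ t: everything s carries, t also carries) is invented.

situations = {"s", "t", "u"}
leq = {("s", "s"), ("t", "t"), ("u", "u"), ("s", "t"), ("s", "u")}  # s ⊑ t, s ⊑ u

def upclose(X):
    """Smallest upset containing X: add every situation above a member of X."""
    return {t for t in situations for x in X if (x, t) in leq}

def entails(X, Y):
    """On the unstructured view, proposition X entails proposition Y iff X ⊆ Y."""
    return X <= Y

P = upclose({"s"})      # {"s", "t", "u"}: whatever s carries is carried above it
Q = upclose({"t"})      # {"t"}
print(entails(Q, P))    # True: the stronger (smaller) proposition entails the weaker one
print(entails(P, Q))    # False
```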

6 General Information

The universal quantifier is like implication. It is easiest to begin with its inferential properties, since its information condition is not immediately obvious (if it were, I wouldn't need to write this paper). To make matters easier, let us suppose that we have a model in which every individual has a name. We will see a more rigorous and formal treatment in the appendix below. Clearly, in this model, the central inferential purpose of the universal quantifier is that it allows one to infer from ∀xA to A[c/x] for all names c. To license this inference it must be that if s carries the information that ∀xA, then it also carries the information that A[c/x] for every c.

In order to explain the information condition for the universal quantifier, we now bring in propositions. As we have said, every statement expresses a proposition. In particular, the statements ∀xA and A[c/x] express propositions: the sets of situations |∀xA| and |A[c/x]| respectively. Putting this together with what we said in the previous paragraph, it must be the case that |∀xA| is a subset of |A[c/x]|, that is, that |∀xA| entails |A[c/x]|.

7 A formal presentation of the Routley-Meyer semantics and our extension of it to treat first order relevant logic is given in an appendix to the present paper.


Generalizing, we obtain:

(MC) If s carries the information that ∀xA, then s is in some proposition that entails |A[c/x]| for every name c.

I call this the "minimal constraint" on general information. It is minimal because it is derived directly from the central inferential role of the universal quantifier and the definitions of "proposition" and "entailment".

On my view, information conditions are biconditionals. So we should ask: what needs to obtain other than the consequent of MC in order for s to carry the information that ∀xA? My answer is "nothing". My reason for this comes from looking at the introduction rule for the universal quantifier. The standard introduction rule (with the addition of subscripts) is:

    | y
    | ⋮
    | A(y)_α
    ∀xA_α          ∀I

In words: from a subproof of A(y)_α, where y does not occur free in any proof on which this subproof depends, we can infer that ∀xA(x)_α. If we accept this introduction rule, then we can derive that the consequent of MC is also sufficient for s ⊨ ∀xA.

Suppose that the consequent of MC obtains, that is, we have a situation s which is in a proposition π that entails |A[c/x]| for every name c. Let us add a new propositional constant p to our language and set |p| = π. It is legitimate to claim that, at least in principle, we can add a sentence that expresses π, since our notion of a proposition is that of a possible content of thought. Thus, |p| entails |A[c/x]| for every c. We can express this entailment fact in the syntax of the natural deduction system by the formula p → A[c/x]_∅. If we have a formula with the empty set as a subscript, we can add it at any step in a proof (see (2)). Now we can construct the following proof:

    1.  p_{s}                    premise
    2.  | x   p → A_∅            assumption
    3.  |     p_{s}              1, reit
    4.  |     A_{s}              2, 3, →E
    5.  ∀xA_{s}                  2–4, ∀I

It seems, then, that the consequent of MC is also a sufficient condition for s to carry the information that ∀xA. So, we have the following biconditional:

    s carries the information that ∀xA
    if and only if
    s is in a proposition that entails |A[c/x]| for every name c.


I suggest that this biconditional be taken to be the information condition for the universal quantifier. Generalizing this condition, we get:

    s ⊨ ∀xA iff there is some proposition π such that
    (i) π ⊆ |A[c/x]| for all names c, and (ii) s ∈ π.

This condition also has the virtue of making valid the elimination rule for the universal quantifier. For suppose that we have a situation s such that s ⊨ ∀xA. Our information condition tells us that s is in a proposition π such that π ⊆ |A[c/x]| for all names c. Now consider a term τ. We take a name c that has the same referent as τ and (after proving a few lemmas (see (19))) we know that |A[c/x]| = |A[τ/x]|. Thus, s ∈ π ⊆ |A[τ/x]|, so s ∈ |A[τ/x]|, i.e. s ⊨ A[τ/x].

Our information condition for ∀ is minimal in a philosophical sense. Any theory that holds that propositions are sets of situations and that accepts the above introduction and elimination rules for the universal quantifier must also accept the above biconditional. Although our information condition is minimal in a philosophical sense, it is not minimal in terms of the formal logic. In order to formalize this condition, we have added to the basic model theory of situations and relations on situations a set of propositions (i.e. a designated set of sets of situations). Taking care of this set of propositions can be formally cumbersome, but this model theory has been shown to have various formal virtues: it has been used to characterize a wide range of logical systems (modal and relevant) and to prove some independence results (see (13), (14), and (19)). Thus, the condition can be motivated on formal as well as philosophical grounds.
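Here is a minimal finite-model sketch of the generalized condition just stated (my own illustration; Prop, the situations and the atomic interpretation are invented): s ⊨ ∀xA holds when some designated proposition containing s sits inside every instance |A[c/x]|.

```python
# Sketch of the information condition for the universal quantifier:
# s ⊨ ∀xA  iff  some X in Prop has s ∈ X and X ⊆ |A[c/x]| for every name c.

situations = {"s1", "s2", "s3"}
names = {"zermela", "lola"}

# |Dog(c)| for each name c (invented): the situations carrying "c is a dog".
instance = {"zermela": {"s1", "s2"}, "lola": {"s1", "s2", "s3"}}

# The designated set of propositions (a set of sets of situations), also invented.
Prop = [frozenset({"s1"}), frozenset({"s1", "s2"}), frozenset(situations)]

def carries_universal(s):
    """Does s carry the information that everything (named) is a dog?"""
    return any(s in X and all(X <= instance[c] for c in names) for X in Prop)

print(carries_universal("s1"))   # True: {"s1"} (or {"s1","s2"}) does the job
print(carries_universal("s3"))   # False: no proposition containing s3 is inside |Dog(zermela)|
```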

7 Russell's Argument

What we have said so far about the informational nature of "all" mirrors to a great extent an argument due to Bertrand Russell (21). Russell examines a case of "exhaustive induction", a case in which one comes to a general conclusion by enumerating all the salient particular facts.

For example, Zermela is in my lounge, she is a mammal and she is a dog. Lola is in my lounge and she is a mammal and a dog. Thus, we conclude, every mammal in my lounge is a dog. But this is not a valid inference. We need another piece of information: it would be valid if we knew, for example, that the only mammals in my lounge are Zermela and Lola. But this extra piece of information is general; it tells us about all the mammals in my lounge (that they are identical either to Lola or Zermela). Russell thinks this point is universal:

You can never arrive at a general proposition by inference from particular propositions alone. You will always have to have at least one general proposition in your premisses. ((21) 206)

Thus, there is a sort of circularity about inferences of this sort to generalities.They presuppose that other general propositions hold.


Now Russell, and following him, Armstrong ((3) and (4)), hold that this argument shows that there must be general facts of some sort. They think that for every true statement (or perhaps proposition) there must be a thing that makes it true (its "truthmaker"). Moreover, Armstrong explicitly, and perhaps Russell implicitly, believe that a truthmaker makes a proposition true by the truthmaker's entailing that proposition. Thus, the failure of an entailment from the particular facts about the mammals in my lounge to the proposition that every mammal in my lounge is a dog shows that there must be an extra fact of some sort.

The thesis that truthmakers entail the things that they make true has been criticized in the literature (e.g. (15)). I do not want to enter into this debate here. We are interested not in the nature of facts but in that of information. I do think that Russell's argument shows this: in addition to the particular facts about the denizens of my lounge there must be some sort of general information if we are to validly infer that every mammal in my lounge is a dog. On my view, entailment, like other logical relations, has to do primarily with information. The failure of entailment shows that we need more information. This extra information, in my view, is a general proposition of some sort.

Thus, it seems from the proof theory and Russell's argument that we have good motivation for the inclusion of general propositions in our semantics. In what follows we look at the nature of general propositions.

8 A Legal Example

Here we will look at three sorts of cases in which we have general information. I do not pretend that these are exhaustive. I wish I had a complete taxonomy of types of general information, but I don't. For the present, I give these three rather different sorts of cases to indicate how we come to have general information and how this process is captured by our semantics.

I start with a case that fits in straightforwardly with what we have said so far about the nature of situations and information.

The Seventeenth Amendment to the Constitution of the United States says "the Senate of the United States shall be composed of two senators from each state". Thus, it is a law in America that each state can have only two senators. I don't think that there is currently any judgment on what would happen if a state tried to appoint more than two senators, but for the sake of our example let us assume that the appointments would not be legal and no more than two of the appointments (if any) would be considered actual appointments. Allowing this supposition, we can say that it is a legal fact that there can (in an alethic sense) be at most two senators from any state. This fact, and the proposition that represents it, constitutes an informational link in the sense of section 4 above.

Here are some facts about American law and the senators from the state of Wisconsin:

1. It is a fact that Russell Feingold is a senator from Wisconsin.


2. It is a fact that Feingold is a Democrat.

3. It is a fact that Herb Kohl is a senator from Wisconsin.

4. It is a fact that Kohl is a Democrat.

5. It is a fact that Russell Feingold and Herb Kohl are numerically distinct.

6. It is a law that there can be at most two senators from any state.

7. The state of Wisconsin has not broken any law to do with the appointmentof Senators.

These facts, together with what we have supposed about American law, entail that every senator from Wisconsin is a Democrat. As we have said, the legal proposition that there can be at most two senators from any state is an informational link. When conjoined with the other facts listed here, it entails that all senators from Wisconsin are Democrats. Facts 1, 3, 5 and 6 entail that, for any individual, if it is a senator from Wisconsin, it is identical either to Feingold or Kohl. This is a general proposition. Together with the facts that Kohl and Feingold are both Democrats, this general proposition entails that every senator from Wisconsin is a Democrat.
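As a toy check of the entailment pattern in this example (my own sketch; the encoding is invented and the check is brute enumeration, not the relevant natural deduction system), the general proposition that any Wisconsin senator is Feingold or Kohl is exactly what licenses the step from the particular facts to the universal conclusion.

```python
# Sketch of the exhaustive-induction step in the senators example (invented encoding).

democrats = {"feingold", "kohl"}                      # facts 2 and 4

# The general proposition supplied by facts 1, 3, 5 and 6 together with the
# two-senator law: every senator from Wisconsin is Feingold or Kohl.
wisconsin_senators = {"feingold", "kohl"}

def every_wisconsin_senator_is_a_democrat(senators):
    return all(x in democrats for x in senators)

print(every_wisconsin_senator_is_a_democrat(wisconsin_senators))   # True

# Drop the closure and the particular facts alone no longer settle the question:
# a further, non-Democrat senator is left open.
print(every_wisconsin_senator_is_a_democrat(wisconsin_senators | {"mystery_senator"}))  # False
```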

9 A Mathematical Example

Now we turn from legal to mathematical information. In proving a universally quantified statement in mathematics, if we are performing a direct proof (as opposed to a reductio), we prove that a statement with a free variable holds, that is, we show that a statement holds of any arbitrarily chosen object of the appropriate kind. We then infer that the statement holds of all objects from that domain.

Let's consider a familiar example. In a standard proof that the set of prime numbers is infinite, we show that for any prime number we can construct a larger prime number. Here is a sketch of the proof. Suppose that n is an arbitrary prime number. Then we take all of the prime numbers less than or equal to n, multiply them together and add 1. Since this new number m is at least one greater than n, it is strictly greater than n. Moreover, m is not divisible by any prime less than or equal to n (dividing by any of them leaves remainder 1), so either m is itself prime or it has a prime factor strictly greater than n. Either way, for every prime number, there is at least one prime number strictly greater than it. Therefore, there are infinitely many prime numbers.

The point that is of interest to us is that we can infer the universally quantified statement "for every prime number there is at least one prime number strictly greater than it" from the proof that, for an arbitrarily chosen prime n, there is a prime number strictly greater than n. This of course is just an informal use of the rule of ∀I that we give in section 6 above. We are not, however, concerned here with the exact form of the proof, but with the fact that the existence of the proof is the general information that allows the inference of the universally quantified statement.


Two questions arise here, one metaphysical and the other logical. The metaphysical question is easy to brush aside in the current context. The question is what we mean by a proof here: are we committed to Platonism about proofs? Although my view is compatible with Platonism, it is compatible with almost every philosophy of mathematics. Any reasonable philosophy of maths will either treat the expression "there is a proof" literally or adopt some paraphrase that makes at least some statements of the form "there is a proof of p" true. Any such paraphrase should be acceptable in the current context. That is, it should determine a general proposition of the salient sort.8

The logical question is rather more difficult. This is the question of what sort of proofs count as general information of the requisite sort. Only one answer is possible in this context: relevantly valid proofs. This raises a further problem. A lot of proofs that mathematicians actually give are not (as stated) relevantly valid. For example, they often use the rule of disjunctive syllogism. Luckily, we can recast many of these proofs as relevantly valid proofs. For example, suppose that we are considering a proof accepted among mathematicians of the form of a disjunctive syllogism,

    p ∨ q
    ¬p
    ⇒ q

Disjunctive syllogism is not a relevantly valid form of inference. But it is commonly used by mathematicians and almost everyone else.

In order to deal with this problem, in (17) I suggested that inferences of this form can be thought of as enthymemes. The fully displayed inference is of the form

    ((p ∨ q) ∧ ¬p) ⇒ q
    (p ∨ q) ∧ ¬p
    ⇒ q

The double arrow in the major premise is a relevant indicative conditional (described in chapter 7 of (17)). Now we have a valid inference: it is just a case of modus ponens. Moreover, I argue in (17) that if the premises of the original inference are true then the fully displayed inference is sound. In this way we can recapture a good number of standard mathematical proofs. Whether this is sufficient to explain all of what we think of as good mathematical practice is not clear, but that is not a topic for the present paper.

Before we leave the topic of mathematical information, I want to point out that I am not claiming that all general mathematical information comes from proofs. We do not prove all mathematical generalities. Many widely accepted axioms (such as Peano's axioms for arithmetic) are generalities.

8 I am grateful to Stephen Read for this view. Read, at one time at least, thought that the truthmakers for necessities were proofs. So he (at least at that time) went much further than my claim that certain situations carry certain mathematical information because in those situations there exist proofs.


Thus, there must be some other sort of mathematical information. What that is, however, I will leave to those working in the epistemology of mathematics.9

10 A Visual Example

My last example is about common empirical experience.

Suppose that we see in front of us a glass bowl with three peaches in it. Since the bowl is glass, if we pick the bowl up and look at it from all sides (or even remove and replace the peaches) we can see that all the pieces of fruit in the bowl are peaches. It doesn't take much time for us to realize that in cases like this we can make universal generalizations about these peaches and certain of their properties, and that we can justify these generalizations using exhaustive induction. Let's call this sort of case "locally Tarskian" because it seems that we are given the entire domain of pieces of fruit in the bowl and, for certain formulas A, that the general information that ∀xA is available in this situation if and only if A holds of each of the peaches we can see.

Russell's argument that we examined in section 7 above, however, shows that from a logical point of view, there is something interesting about locally Tarskian circumstances. We need not only information about the individual pieces of fruit in order to make generalizations about them, but also the information that these individual pieces of fruit are the only ones in the bowl. We need now to discuss the nature of this further (general) information.

10.1 The Closed World Assumption

Our starting point for our discussion of locally Tarskian information is the idea of closed worlds that I borrow from default logic and artificial intelligence. The term "closed world assumption", I think, was introduced into the literature by Ray Reiter. As it is used in artificial intelligence programming, the closed world assumption is the assumption that a database or program has available to it all the salient facts. For example, suppose that an insurance database lists a number of valuable items belonging to a particular policy holder. If the program searching the database employs the closed world assumption, it will say, if asked, that the policy holder does not own any other valuables. Thus, the program assumes that the facts in the database, for the purposes of making inferences, constitute a "closed world".

The technical machinery of the relevant semantics gives us a choice in interpreting the closed world assumption in our framework.

9 An alternative view of mathematical generality can be found in the classic, but nowadays often overlooked, paper by Alice Ambrose (1). Ambrose says that a mathematical sentence of the form "All fs are g" is true because of "an empirical truth about language" (p 117). On her view, all necessary truths are true because of the grammar of our language, in the Wittgensteinian sense of being appropriately connected by the rules that govern the use of the relevant predicates in the language. If this view is right, I can fit it easily into my framework. Conventional rules of this sort create informational links and can be the bases of general propositions.


In order to understand the distinction between the two alternative interpretations we need to understand the distinction between asserting a negative proposition and denying a positive proposition. In the context of logics with non-classical negations like relevant logic,10 we must distinguish between the assertion of ¬A and the denial of A. A proposition A is accurately denied in a situation s if and only if A fails to obtain in s.11 The negation ¬A obtains in s if and only if there is some feature of s that is incompatible with the truth of A. The difference here can be understood in terms of the partial order on situations (see section 5 above). If the denial of A is accurate in s, then it is still possible that there are ⊑-extensions of s in which the information that A obtains. Negations, on the other hand, are persistent. If ¬A obtains in s, then it obtains in all ⊑-extensions of s. We will call these two interpretations of the closed world assumption the denial interpretation and the negation interpretation.

The denial interpretation is the one that is commonly used in artificial intelligence, in particular in connection with the interpretation of negation in Prolog. Here is a quotation from Ivan Bratko's textbook on Prolog:

According to this assumption the world is closed in the sense that everything that exists is in the program or can be derived from the program. Accordingly then, if something is not in the program (or cannot be derived from it) then it is not true and consequently its negation is true. This deserves special care because we do not normally assume that "the world is closed" ... ((9) 135)

Bratko suggests that ¬A according to Prolog really means "there is not enough information in the program to prove that A" (ibid. 134-5). This form of negation does not match the meaning of "not" in English or the meaning of any other natural language negation.

With regard to negation and, more importantly for our present purposes, the universal quantifier, the denial interpretation is not what we need. As we have seen in section 5 above, propositions are persistent; thus, every general proposition must also be persistent. This eliminates the denial interpretation. So, it seems we must accept the negation interpretation of the closed world assumption for our present purposes.12 Thus, in the cases above, we assume the proposition that there are no more relevant facts concerning the particular subject matter.

10 The case of intuitionist logic is interesting. Intuitionists deny the law of excluded middle, but they cannot assert the negation of any particular example of excluded middle, since ¬(A ∨ ¬A) is a contradiction in intuitionistic logic. David McCarty has convinced me that this and other similar cases can be understood by using intuitionist logic in an intuitionist metalanguage. Thus, the denial of excluded middle is formalized as ⊬ (A ∨ ¬A). But my point still holds: if we abandon classical negation, then denials cannot be merely assertions of negations.

11 I have given a semantics for denials and statements that assert that some proposition is denied in (18).

12 One might object that I have just used disjunctive syllogism, which notoriously is invalid in relevant logic, to argue for a thesis. This is true, but I interpret disjunctive syllogism as an enthymeme, which has a valid form. See (17).


Note that there is a more important difference here between our locally Tarskian circumstances and the closed worlds of artificial intelligence. We do not always assume that our situations are closed in every way, but only in particular ways. Because of sensory clues, background information, and so on, we sometimes assume that there are no more facts of some salient sort. But as Bratko says in the above quote, we usually do not assume that the information available to us is all the information in the universe on a given topic. Thus, our assumptions are much more targeted than the "closed world assumption" to which artificial intelligence appeals.

The idea is that when we come to think that a circumstance is locally Tarskian, there is a range of predicates about which we can make generalizations if the salient exhaustive induction is successful. In our example of the peaches in the bowl, we can infer after looking at and smelling each peach that all of the peaches in the bowl appear ripe, but we cannot merely on the basis of this deduce that all of them came from the same tree. We are not deductively warranted in making generalizations using predicates that explicitly or tacitly refer to entities that are not present in the current situation. Our world is closed only with regard to those entities that are present and the properties and relations that they have to one another. We are not deductively warranted in making generalizations about things that are not present.
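The contrast between Prolog-style negation as failure and the targeted closure assumption described here can be sketched as follows (an illustration of my own; the fact base and predicate names are invented).

```python
# Sketch: negation as failure vs. a targeted closure assumption (invented data).

facts = {("peach", "p1"), ("peach", "p2"), ("peach", "p3"),
         ("in_bowl", "p1"), ("in_bowl", "p2"), ("in_bowl", "p3"),
         ("ripe", "p1"), ("ripe", "p2"), ("ripe", "p3")}

def negation_as_failure(atom):
    """Prolog-style: count anything not derivable as false (global closed world)."""
    return atom not in facts

# Targeted closure: we only assume closure for the predicates we can survey,
# here "in_bowl" and "ripe"; "same_tree" concerns absent entities, so no closure.
closed_predicates = {"in_bowl", "ripe"}

def warranted_generalization(predicate):
    bowl = {x for (p, x) in facts if p == "in_bowl"}
    if predicate not in closed_predicates:
        return None           # no deductive warrant either way
    return all((predicate, x) in facts for x in bowl)

print(negation_as_failure(("same_tree", "p1")))   # True: global closure overreaches
print(warranted_generalization("ripe"))           # True: every peach in the bowl is ripe
print(warranted_generalization("same_tree"))      # None: outside the closed range
```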

11 Varying Domains

In the previous section, we appealed to the idea that certain entities can be present in a situation and others not present. In order to make sense of this distinction in our semantics, we need to have the domains of individuals vary from situation to situation. In (19), we have a single domain of individuals, but Goldblatt and I have modified this semantics to allow for variable domains.

First, we add an existence predicate "E" into our language. We take our quantifiers to have existential import. That is, when we use ∀ we mean "every existing thing". The same holds for the existential quantifier, but as we have said we are avoiding mention of it in this paper. We change our natural deduction rules for the universal quantifier as well. Our elimination rule becomes:

    ∀xA_α
    Eτ_β
     ⋮
    A[τ/x]_{α∪β}        ∀Ee

where x is free for τ in A(x). This rule tells us that when we instantiate a universally quantified variable we assume that the term we are using refers to something that exists.


The introduction rule is similar:

    | y   Ey_{k}
    |  ⋮
    | A(y)_α
    ∀xA(x)_{α−{k}}        ∀Ie

This version of the introduction rule in some ways makes more sense than the original version. In the original version, we are told to assume the existence of an object just called "x" and then show that it is A, but the assumption isn't explicitly stated in the proof (at least not in the object language). Here the existence assumption is explicitly stated in the proof in the object language.

In this version of the semantics, having a piece of general information, ∀xA(x), is just to have the licence to infer from the existence of something to its being A. Thus, if we have that licence we are in possession of that general information and vice versa. Thus, our information condition for the universal quantifier now says:

    s ⊨ ∀xA if and only if there is a proposition π such that
    (i) π entails |Ec → A[c/x]| for every name c, and (ii) s is in π.

Thus, a general piece of information has become a licence to infer from the existence of an entity to its having a property of some sort.

12 Summary

In this paper, I set out a theory of the universal quantifier in an informational semantics. According to this theory, the information that ∀xA is carried by a situation s if and only if a proposition obtains at s which entails every instance of A. This proposition is called a "general proposition". This information condition for "all" is more reasonable than the standard Tarskian condition in an informational semantics, since informational semantics contain indices with partial information about the world. If we do not know that a list of facts is exhaustive in the salient ways, then we are not entitled to draw general conclusions from it. More information is needed. The use of general propositions thus fills the informational gap between the particular facts contained in a situation and the generalizations that we draw from that situation. I argue that this use of general propositions is mirrored by the way that we actually make generalizations in very different sorts of circumstances. At the end of the paper, I briefly present a modification of the view to allow the domain of individuals to vary from situation to situation.

13 Appendix: The Formal Semantics

The model theory I present here is essentially the same as (19), but there is an important change. The model does not contain a set of propositional functions as in (19), but instead distinguishes between premodels and models, following Goldblatt and Hodkinson (13). If the reader compares the current approach to that of (19), he or she will see that the current approach is significantly simpler and makes the soundness proof much easier.

A constant domain model structure is a sextuple 𝒮 = ⟨S, 0, R, C, I, Prop⟩

such that the following hold. S is the set of situations of the model. 0 is a non-empty set known as the "logical situations". It is at the situations in 0 that the theorems of the logic are verified. R is a ternary relation on S. R is used to treat implication: it is used to formalize the theory of situated inference (see chapter 3 of (17)). C is the compatibility relation discussed in section 4 above; it is a binary relation on S. I is the set of individuals of the model. We need only specify that I is a non-empty set (of individuals). Prop is the set of propositions; it is a set of subsets of S.

In order that this sextuple be a model structure for the logic QR it must also obey the semantic postulates given below. Where s ⊑ t iff ∃x(x ∈ 0 & Rxst) and s* is a ⊑-maximal situation compatible with s,

SP1 ⊑ is transitive and reflexive (not necessarily anti-symmetric).

SP2 Rsss (full reflexivity, to guarantee ((A → B) ∧ A) → B).

SP3 If Rstu, then Rsut (to guarantee A → ((A → B) → B)).

SP4 If ∃x(Rstx & Rxuv), then ∃x(Rsux & Rxtv) (this is the so-called Pasch postulate, to guarantee the transitivity of implication).

SP5 If Rstu and s′ ⊑ s, then Rs′tu (this guarantees the persistence of implicational information, i.e. if a ⊨ A → B and a ⊑ b, then b ⊨ A → B).

SP6 For all s ∈ S, s* exists.

SP7 If Rstu, then Rsu*t* (this guarantees that contraposition holds).

SP8 s** = s (this guarantees that double negation elimination holds).

Before we can define models for our language, we need to set out the language itself. It is a standard first order language with relation symbols of various arity, individual constants, countably many individual variables (x_1, x_2, ...), the unary connective ¬, the binary connectives ∧ and →, the quantifier ∀, and parentheses. The logic QR, which is here being modelled, is, in a Hilbert-style axiomatization, Anderson and Belnap's logic R (see (2)) together with the axiom ∀xA → A[τ/x], where x is free for τ in A, and the rule ⊢ A → B ⟹ ⊢ A → ∀xB, where x does not occur free in A.

As I said above, I follow (13) in distinguishing between premodels and models. Here is their definition of a premodel (adapted from modal logic to relevant logic). A premodel for QR is a pair M = ⟨𝒮, |·|^M⟩, where 𝒮 is a model structure and |·|^M is an interpretation function on the language. |·|^M assigns


• to each n-place relation symbol P of our language, a function |P|^M : I^n → Prop;

• to each individual constant c, an individual |c|^M ∈ I.

The interpretation is then used to give information conditions for all the sentences of the language, by means of a fairly standard inductive definition. Before we give the rest of the definition, we need the notion of a variable assignment function. An assignment function f is a function from the natural numbers ω into the set of individuals I. If f(n) = i, then f assigns i to x_n. We use the symbol "f[i/n]" to mean a variable assignment function just like f except that it assigns i to x_n. Moreover, for any term (individual variable or constant) τ, τ^M f is |τ|^M if τ is a constant and f(n) if τ is x_n. Now we can give the inductive clauses defining |·|^M_f:

• |P(τ_1...τ_n)|^M_f = |P|^M(τ_1^M f, ..., τ_n^M f) ∈ Prop;

• |A ∧ B|^M_f = |A|^M_f ∩ |B|^M_f;

• |¬A|^M_f = {s ∈ S : ∀x(Csx ⊃ x ∉ |A|^M_f)};

• |A → B|^M_f = {s ∈ S : ∀x∀y((Rsxy & x ∈ |A|^M_f) ⊃ y ∈ |B|^M_f)};

• |∀x_nA|^M_f = ⊓_{i∈I}(|A|^M_{f[i/n]}).

Where X is a set of sets of situations, ⊓_{i∈I}X = ∪{π ∈ Prop : π ⊆ ∩X}.

A premodel M is a model if for all assignments to variables f and all formulas A of the language, |A|^M_f ∈ Prop. A formula A is valid on the class of all models if for each model M, 0 ⊆ |A|^M_f. In order to define a valid deduction, we define a generalization of our ternary accessibility relation:

    R^0 st iff s ⊑ t

    R^1 stu iff Rstu

    R^{n+1} s_1...s_{n+1} s_{n+2} t iff ∃x(R^n s_1...s_{n+1} x & R x s_{n+2} t).

Since it is understood that these relations are just products of the original ternary relation, we drop the superscripts on the symbol R. A deduction of B from A1_{1}, ..., An_{n} is valid on a model if and only if, for every assignment f and for all situations s_1, ..., s_n and t, if Rs_1...s_n t and s_k ∈ |Ak|^M_f (for 1 ≤ k ≤ n), then t ∈ |B|^M_f. It can be seen with a little work that the rules of ∀I and ∀E preserve validity in models (as do the other rules discussed above).

In order to accommodate variable domains, we add to our language a unary predicate "E". "Eτ" means "τ exists". We also add to our definition of a model structure a function D from situations to subsets of I. Then we have s ∈ |Eτ|^M_f if and only if τ^M f ∈ Ds. This simply says that an entity exists at a situation if it is in the domain of that situation. We modify our information condition for the universal quantifier so that it now says:

    |∀x_nA|^M_f = ⊓_{i∈I}|Ex_n → A|^M_{f[i/n]}


We now have a semantics that validates ∀Ie and ∀Ee. The logic that results from adding these rules to the rules for Anderson and Belnap's R is both sound and complete over this semantics. In order to axiomatize this logic, we add to the logic R the axiom ∀xA → (Eτ → A[τ/x]), where x is free for τ in A, and the rule ⊢ A → (Ex → B) ⟹ ⊢ A → ∀xB, where x does not occur free in A.13,14

13 We allow statements about things that do not exist at a situation to be true at that situation (in my opinion, we want to allow statements such as "if Sherlock Holmes is a famous detective, then he is famous" to be true in some actual situations).

14 Acknowledgements: I am especially grateful to Rob Goldblatt, with whom I have developed the theory of quantification discussed in this paper, and to Sebastian Sequoiah-Grayson, who commented extensively on an early version. Early drafts of this paper were read at the Australasian Association of Philosophy meeting in Wellington, Korea National University, and the Workshop on Philosophy of Information and Logic in Oxford. This paper was also given as the 2007 Alice Ambrose Lazerowitz/Thomas Tymoczko Memorial Lecture at Smith College. I am grateful to those audiences, especially Phil Bricker, Luciano Floridi, Jay Garfield, Carrie Jenkins, Sehwa Kim, Dan Nolan, Josh Parsons, Johan van Benthem, and Mark Jago.
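The inductive clauses above can be prototyped directly on a finite premodel. The following sketch is my own illustration (the frame, interpretation and names are invented, and the semantic postulates are not checked), showing the clauses for ∧ is omitted for brevity aside, ¬, → and the proposition-based clause for ∀, together with the E-guarded variant for varying domains.

```python
# Finite-premodel sketch of the inductive clauses (illustrative; postulates not enforced).

S = {"a", "b"}
zero = {"a"}                                  # the "logical" situations
Rrel = {("a", "a", "a"), ("a", "b", "b"), ("b", "b", "b")}
Comp = {("a", "a"), ("a", "b"), ("b", "b")}   # compatibility relation C
I = {0, 1}
Prop = [frozenset(), frozenset({"b"}), frozenset({"a", "b"})]

interp = {"P": lambda i: frozenset({"a", "b"}) if i == 0 else frozenset({"b"})}
domain = {"a": {0}, "b": {0, 1}}              # D, for the varying-domain clause

def meet(family):
    """The appendix's ⊓: union of all propositions contained in the intersection of the family."""
    inter = set.intersection(*[set(x) for x in family]) if family else set(S)
    return frozenset().union(*[X for X in Prop if X <= inter])

def ext(formula, f):
    op = formula[0]
    if op == "P":        return interp["P"](f[formula[1]])
    if op == "E":        return frozenset({s for s in S if f[formula[1]] in domain[s]})
    if op == "and":      return ext(formula[1], f) & ext(formula[2], f)
    if op == "not":
        A = ext(formula[1], f)
        return frozenset({s for s in S if all(t not in A for t in S if (s, t) in Comp)})
    if op == "imp":
        A, B = ext(formula[1], f), ext(formula[2], f)
        return frozenset({s for s in S
                          if all(u in B for (x, t, u) in Rrel if x == s and t in A)})
    if op == "forall":   # constant-domain clause: ⊓ over all reassignments of the variable
        n = formula[1]
        return meet([ext(formula[2], {**f, n: i}) for i in I])
    raise ValueError(op)

# Validity check: a formula is valid if every logical situation is in its extension.
A = ("forall", "x", ("P", "x"))
print(zero <= ext(A, {"x": 0}))   # is ∀xP(x) valid on this toy premodel? (False here)

# Varying-domain version: guard each instance with the existence predicate.
A_guarded = ("forall", "x", ("imp", ("E", "x"), ("P", "x")))
print(zero <= ext(A_guarded, {"x": 0}))   # True on this toy premodel
```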

References

[1] A. Ambrose, "Mathematical Generality", in M. Lazerowitz and A. Ambrose, Necessity and Language, London: Croom Helm, 1985, 101-120

[2] A.R. Anderson, N.D. Belnap, and J.M. Dunn, Entailment: The Logic of Relevance and Necessity, Volume II, Princeton: Princeton University Press, 1992

[3] D.M. Armstrong, A Combinatorial Theory of Possibility, Cambridge: Cambridge University Press, 1989

[4] D.M. Armstrong, A World of States of Affairs, Cambridge: Cambridge University Press, 1997

[5] D.M. Armstrong, Truth and Truthmakers, Cambridge: Cambridge University Press, 2004

[6] J. Barwise, "Constraints, Channels, and the Flow of Information", in P. Aczel, D. Israel, Y. Katagiri, and S. Peters (eds.), Situation Theory and its Applications, Volume 3, Stanford: CSLI, 1993, 3-27

[7] J. Barwise and J. Etchemendy, The Liar, Oxford: Oxford University Press, 1987

[8] J. Barwise and J. Perry, Situations and Attitudes, Cambridge, MA: MIT Press, 1983

[9] I. Bratko, Prolog Programming for Artificial Intelligence, Wokingham, UK: Addison-Wesley, 1986

[10] J.M. Dunn, "Star and Perp", Philosophical Perspectives 7 (1993): 331-57

[11] K. Fine, "Semantics for Quantified Relevant Logic", Journal of Philosophical Logic 17 (1988): 27-59; reprinted in (2), §53

[12] K. Fine, "Incompleteness of Quantified Relevance Logics", in J. Norman and R. Sylvan (eds.), Directions in Relevant Logic, Dordrecht: Kluwer, 1989

[13] R. Goldblatt and I. Hodkinson, "Commutativity of Quantifiers in Varying-Domain Kripke Models", in D. Makinson, J. Malinowski, and H. Wansing (eds.), Trends in Logic: Towards Mathematical Philosophy, Heidelberg: Springer, forthcoming

[14] R. Goldblatt and E.D. Mares, "A General Semantics for Quantified Modal Logic", in G. Governatori, I. Hodkinson, and Y. Venema (eds.), Advances in Modal Logic, Volume 6, London: College Publications, 2006

[15] J. Heil, From an Ontological Point of View, Oxford: Oxford University Press, 2003

[16] E.D. Mares, "A Star-Free Semantics for R", The Journal of Symbolic Logic 71 (2006): 163-187

[17] E.D. Mares, Relevant Logic: A Philosophical Interpretation, Cambridge: Cambridge University Press, 2004

[18] E.D. Mares, "The Semantics of Denial", in B. Brown and F. Lepage (eds.), Truth and Probability: Essays in Honour of Hugues Leblanc, London: College Publications, 2006, 49-58

[19] E.D. Mares and R. Goldblatt, "An Alternative Semantics for Quantified Relevant Logic", The Journal of Symbolic Logic 71 (2006)

[20] G. Restall, "Information Flow and Relevant Logics", in J. Seligman and D. Westerståhl (eds.), Logic, Language and Computation, Stanford: CSLI, 1996, 463-477

[21] B.A.W. Russell, The Philosophy of Logical Atomism, in Russell, Collected Papers, Volume 8, London: George Allen and Unwin, 1986, 157-244
