Transcript
Page 1: What linguists are talking about when talking about…

Language Sciences 45 (2014) 56–70

Contents lists available at ScienceDirect

Language Sciences

journal homepage: www.elsevier .com/locate/ langsci

What linguists are talking about when talking about…

David J. Lobina, Faculty of Philosophy, University of Oxford, Radcliffe Humanities, Radcliffe Observatory Quarter, Woodstock Road, Oxford OX2 6GG, England, United Kingdom

Article info

Article history: Received 2 March 2014; Received in revised form 18 May 2014; Accepted 28 May 2014; Available online

Keywords: Recursive procedures; Recursive rewriting rules; Embedding operations; Recursive structures; Four conflations among the aforementioned constructs

E-mail address: [email protected]

http://dx.doi.org/10.1016/j.langsci.2014.05.006
0388-0001/© 2014 Elsevier Ltd. All rights reserved.

Abstract

A historical look at the manner in which recursion was introduced into linguistics, including how it was used thereafter, shows that Chomsky, the scholar who popularised the use of recursive techniques in linguistics, has always understood this notion to be a central feature of generative procedures, much as it was treated in mathematical logic in the 1930–50s. Recursion is the self-reference property that underlies all types of recursive functions; recursive definitions (or definitions by induction), in addition, justify every stage of the computations effected by computational procedures such as Post production systems or set-operators like merge, making recursion the central feature of a generative grammar. The contemporary literature, however, has confused this recursive property of a grammar with other constructs, such as self-embedded sentences, self-embedding operations, or certain rewriting rules, thereby obscuring the role of recursion in the theory of language. It is here shown that this is the result of the literature implicitly endorsing a number of unwarranted conflations, four of them analysed here. It is concluded that most of the discussion on the centrality and uniqueness of recursion in human language and/or general cognition has been confusing and confused for very fundamental reasons; a story of conflations, in a nutshell.

© 2014 Elsevier Ltd. All rights reserved.

1. In the way of a justification

The world possibly doesn’t need another article on the role of recursion in language. Ever since Hauser et al. (2002) hypothesised that recursion may be the only sui generis feature of human language, a barrage of publications has made its way into print, many of these engaging very different aspects of the hypothesis. It has been claimed by some, supposing recursion to refer to self-embedded sentences such as the cat [the dog [the mouse bit] chased] ran away, that recursion couldn’t possibly be the central property of language on account of the apparent fact that not all languages exhibit such structures (among others, Parker, 2006; Everett, 2012). Others, equating this recursive property with an embedding or self-embedding operation, have pointed out that such an operation appears to be part of the grammar of all languages (e.g., Hauser, 2009 and, to some extent, Nevins et al., 2007). Many more issues have been discussed in the literature, of course; my intention here, however, is not to catalogue them all but to organise our knowledge of all things recursive; that is, I offer a conceptual analysis, and high time for such an analysis it is.

Granted, similar analyses have been offered before, the most prominent perhaps being those of Tomalin (2007) and Fitch (2010); as I will show below, however, there are some problems with these two publications (in terms of both the quality and the quantity of what they say), not least the fact that they are not entirely compatible with each other – indeed, they take



recursion to be different things – and that is an issue that must be addressed. In any case, neither of these two papers, or any other from the literature for that matter, seems to have had much of a positive effect; much confusion remains, as evidenced in a recent paper that attempts to clarify what recursion is by focussing on mathematical logic – namely, Watumull et al. (2014) – but which commits multiple, and very obvious, mistakes (surprisingly, neither Tomalin, 2007 nor Fitch, 2010 are referenced).

By quoting a text of Gödel’s (1931) in which the noted mathematician offers a definition of what he therein calls “recursive functions” (these are now known as the primitive recursive class; Davis, 1965, p. 4), Watumull et al. (2014), WEA in this paragraph and the next, identify ‘three criterial properties’ of what they call the ‘primitive notion of recursion’ (p. 2): a) a recursive function must specify a finite sequence (Turing computability, WEA claim); b) this function must be defined in terms of preceding functions, that is, it is defined by induction, which WEA associate with strong generativity (i.e., the generation of ever more complex structure); and c) this function may in fact just reduce to the successor function (that is, mathematical induction, which WEA associate with the unboundedness of a generative procedure). However, Gödel’s text doesn’t say that at all, as the full quote demonstrably shows (the italicised words below mark the material WEA purposely omit from their citation of Gödel (1931); I’m quoting from Davis (1965), which offers a different translation from the one WEA use, but this doesn’t affect my point in the slightest):


A number-theoretic function f is said to be recursive if there exists a finite sequence of number-theoretic functions f1, f2, …, fn which ends with f and has the property that each function fk of the sequence either is defined recursively from two of the preceding functions, or results [footnote not included] from one of the preceding functions by substitution, or, finally, is a constant or the successor function x + 1 (pp. 14–5; underline in the original, my italics).

In this quote, Gödel is simply identifying which functions from a finite list are to be regarded as primitive recursive (cf. the definition of this class in Kleene, 1952, pp. 220–3, which is very similar indeed)1; what the text is clearly not offering is a combination of properties subsuming the recursion concept – and in any case, by omitting the material in italics, WEA end up with three ‘criterial properties’ by design (whatever happened to substitution or a constant?). More importantly, there are no grounds for identifying Gödel’s finite sequence of functions with Turing computability (a notion that was unavailable to Gödel at the time in any case), and it is certainly a mistake to equate a definition by induction with strong generativity, or the successor function with mathematical induction and/or unbounded computations – all these are related but clearly independent concepts. Having said that, WEA’s focus on mathematical logic is to be welcomed, but the clear deficiencies in that paper must also be addressed.2

Given the current state of affairs, then, no-one could be faulted for wondering what it all means; or what it could all mean, in any event. Despite the documentary density, I advance that there is some novelty to be had, especially for the philosopher, and of some worth to the engorged literature to boot. To that end, I start this essay with a brief description of how recursion has been understood within the formal sciences, especially in the fields of mathematical logic and computer science. I then chronicle both the introduction of recursion into linguistic studies and how its role within generative grammar has evolved over the ensuing decades; to this end, I will focus on the writings of Noam Chomsky, the scholar responsible for introducing recursive techniques into linguistics. Rather surprisingly, given the prominence of this notion in the literature, such an exercise is yet to be done, let alone offered for reflection. This type of analysis, however, brings to light a number of important historical facts: a) recursion was first employed in the 1950s and 60s in the same manner as it was used in mathematical logic, a field that exerted a great influence on linguists at the time; and b) its application has not changed much over the years, at least as far as Chomsky’s individual writings are concerned. Building upon that, I then demonstrate that the confusion surrounding this concept is not solely a matter of imprecision, as claimed by Tomalin (2011), but a story of conflations: between recursive mechanisms and recursive structures; namely, between the self-reference so typical of recursive functions and self-embedded sentences; between the recursive applications of specific rules of Post production systems and self-embedding operations; and, lastly, between what an operation does and how it applies. As a result, I will conclude, most of the discussion in the literature as to the centrality and uniqueness of recursion in natural language has centred on issues (such as whether all languages exhibit self-embedded sentences) that have little to do with the introduction of recursive tools into linguistics, let alone the reasons for introducing such techniques in the first place. As such, then, some of the strongest claims to be found in the literature are either fallacious or quite simply misplaced.

2. What is recursion, then?

As Brainerd and Landweber (1974) put it, it is useful to define functions ‘using some form of induction scheme…, a general scheme…which we call recursion’ (p. 54). This is in fact consonant with the original interpretation of recursion within mathematics as being part of a definition by induction, as chronicled by Soare (1996). Also known as a recursive definition, it consists in ‘defining a function by specifying each of its values in terms of previously defined values’ (Cutland, 1980, p. 32); a

1 That is, what Gödel is saying is that if we have a list of functions, any one function from this list will be defined as (primitive) recursive if it is defined by induction from previous functions, OR is substituted by some of them, OR is the constant function, OR is the successor function.

2 In what follows, I will not discuss the finer details of WEA’s mistakes, as this would take us far outfield; instead, I will offer my own narrative, which hopefully will provide a much better grounding for the issues at stake here.


self-referential characteristic (Tomalin, 2006, p. 61). As an example, take the factorial class (fact(n) = n × (n − 1) × (n − 2) × … × 1), which can be recursively defined in the two-equation system so common of such definitions, as follows: if n = 1, then fact(n) = 1 (base case); if n > 1, then fact(n) = n × fact(n − 1) (recursive step). Note, then, that the recursive step involves another invocation of the factorial function. Thus, in order to calculate the factorial of, say, 4 (i.e., 4 × 3!), the function must return the result of the factorial of 3, and so on until it reaches the factorial of 1, the base case, effectively terminating the recursion.
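The two-equation definition just given translates directly into a short program. A minimal sketch in Python (the paper’s own later example is in LISP); the function name `fact` is chosen here for illustration:

```python
def fact(n: int) -> int:
    """Factorial via the two-equation recursive definition."""
    if n == 1:                 # base case: fact(1) = 1
        return 1
    return n * fact(n - 1)     # recursive step: fact invokes itself

print(fact(4))  # 4 * 3 * 2 * 1 = 24
```

Note how the recursive step contains a self-reference to `fact`, exactly mirroring the definition in the text.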

There are a number of different types of recursive functions (the primitive, the general, the partial), and these are all recursive for the very same reason: they are all underlain by the induction scheme, even if they encompass different input–output pairs (hence, the different classes).3 As is well-known, these functions proved to be very important in the 1930s and beyond as a means to formalise the class of computable functions. Church (1936), in particular, identified the general recursive functions with the computable class, but this proposal was not entirely correct (see Soare, 1996, pp. 289–91; Sieg, 1997 for details). Kleene (1938) replaced the general class of recursive functions with the partial class for the purposes Church had in mind, and this type of recursive function was further polished by McCarthy (1963), first, and recently by Moschovakis (2001). From a completely different perspective, that of a generative system that lists a set of integers rather than computing a function, Post (1943) introduced his “canonical systems”, a construct that directly stems from the “generalisation by postulation” method of an earlier work (namely, Post, 1921). According to Post (1943), canonical, or production, systems can be reduced to what he calls a “normal form”, which can be described in terms of the mapping from gP to Pg′, where g stands for a finite sequence of letters (the enunciations of logic) and P represents the operational variables manipulating these enunciations (p. 199). Crucially, the whole approach ‘naturally lends itself to the generating of sets by the method of definition by induction’ (p. 201), a fact of production systems qua formalisation of a computational system that has moved Sieg (1997) to claim that it is most natural to consider the generative procedures underlying algorithms to be ‘finitary inductive definitions’ (p. 166).4 Furthermore, Post (1944) calls the objects his production system generates either recursively enumerable sets or (general) recursive sets, pointing to the fact that every set so constructed is a recursively-defined object (that is, every operation of a production system is a definition by induction), thereby placing recursion at the very heart of these generative systems.

In fact, Soare (1996) describes the spirit of the times of the 1930s and 40s as one in which recursion is taken to be at the very centre of what a computation is, to the point that systems of recursive equations, or recursion itself, are employed as being almost synonymous with computability and/or computable, to the detriment of Turing’s (1936) model, which did not make use of recursion at all.5 This state of affairs, what Soare calls the Recursion Convention, involves the following practices: a) use the terms of the general recursive formalism to describe results of the subject, even if the proofs are based on the formalism of Turing computability; b) use the term Church’s Thesis (in a narrow sense: the identification between general recursive functions (or partial recursive functions) and the computable functions) to denote various theses, including the Turing Thesis (viz., all intuitively computable functions are Turing Machine-computable); and c) name the subject using the language of recursion (e.g., recursive function theory). This issue will be of some importance later on, but I now move on to a description of the manner in which computer scientists have used recursive techniques.

Naturally, the way in which computer scientists employ recursion is similar to that of mathematicians, but some differences are worth pointing out. In a programming language such as LISP, for instance, the step-by-step list of instructions for completing a task – the procedure – can contain steps that are defined in terms of previous steps, much as recursive definitions (Abelson and Sussman, 1996; SICP, from now on). In the case of the factorials, the recursive step is rather obvious in its LISP procedure (note the Polish notation): (if (= n 1) 1 (* n (factorial (- n 1)))) (SICP, pp. 33–4). More importantly, procedures such as these generate computational processes in which an operation calls itself during the course of its real-time implementation, thereby creating chains of deferred operations, the hallmark of recursive processes. The factorials, however, can be computed non-recursively as well (that is, without self-calls and chains of deferred operations), and it is a well-established result of the literature that any task can be solved either recursively or iteratively.6 Given that an iterative computation doesn’t carry deferred operations, it is usually a less costly way of solving a task, where “cost” refers to the memory load incurred by the architecture implementing a given computation, an issue that doesn’t concern the mathematician as much as the computer scientist; that being the case, some attention has been devoted to those tasks that naturally call for a recursive

3 Having said that, though, it would be a mistake to claim that each class involves a different understanding of recursion. That is, just as the primitive class doesn’t yield a primitive notion of recursion, as WEA seem to suggest, the general or partial classes don’t provide general or partial notions of what recursion is.

4 We might, however, point out that Kleene (1952, pp. 260 et seq.) draws a distinction between definitions by induction, or recursive definitions, and inductive definitions, the latter also known as mathematical induction. The focus here will fall on recursive definitions, and mathematical induction won’t be discussed much, if at all.

5 Note that even though Turing’s model was shown to be equivalent to the partial recursive functions or Post’s production systems, this is an extensional equivalence: all these formalisms converge on the same output given a specific input. However, there are intensional differences among these formalisms, as the input–output pairs are achieved differently. In fact, Moschovakis and Paschalis (2008) call algorithms based on recursive definitions recursors and those based on a Turing Machine iterators.

6 This already follows from the extensional equivalence mentioned in the previous footnote: given any input–output pair, such a pair can be computed recursively by a production system as much as it can be computed iteratively by a Turing Machine. In any case, further support for this contention can be found in Rice (1965), where it is demonstrated that all recursive relations can be reduced to iterative relations, and in Liu and Stoller (1999), who provide an ‘optimization technique’ that can automatically transform recursive computations into iterative ones.


solution. Roberts (2006) has drawn attention to complex tasks that can be divided into simpler instances of the overall problem (think of the recursive definition of the factorial class, for instance), whilst Wirth (1986) has centred his attention on a type of data structure that corresponds to recursive procedures, the ‘[v]alues of such a recursive data type [containing] one or more components belonging to the same type as itself’ (p. 172).
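The contrast between recursive and iterative processes drawn above (chains of deferred operations versus a running state carried along) can be made concrete with the factorials. A hypothetical Python sketch, not the SICP code itself:

```python
def fact_recursive(n: int) -> int:
    # Generates a recursive process: each call defers a multiplication
    # (n * ...) until the inner call returns, so pending work piles up.
    if n == 1:
        return 1
    return n * fact_recursive(n - 1)

def fact_iterative(n: int) -> int:
    # Generates an iterative process: the running product is carried
    # along in a single variable, so no operations are deferred.
    acc = 1
    for k in range(2, n + 1):
        acc *= k
    return acc

assert fact_recursive(5) == fact_iterative(5) == 120
```

Both compute the same input–output pairs (the extensional equivalence of footnote 5), but only the first incurs the memory cost of deferred operations.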

The postulation of recursive data structures points to a difference between mathematical logic and computer science, as the latter field posits such objects independently of whether they are generated or manipulated by recursive computations. In general terms, Rodgers and Black (2004) define a recursive data structure as an object or class ‘that is partially composed of smaller or simpler instances of the same data structure’.7 That is, a structure that includes an abstraction of itself (an X within an X), “trees”, “lists” and the like constituting the prototypical cases (trees inside other trees, or lists inside lists, etc.). In contrast, mathematicians restrict the recursive property to functions or rules and never apply it to objects (I will come back to this below).8
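Rodgers and Black’s definition (an X within an X) is easy to exhibit as a data type. A minimal sketch with hypothetical names, using the prototypical case of trees inside trees:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Tree:
    """A recursive data type: a Tree may contain Trees as components."""
    label: str
    left: Optional["Tree"] = None
    right: Optional["Tree"] = None

# A tree partially composed of smaller instances of the same type:
t = Tree("S", Tree("NP", Tree("N")), Tree("VP"))
```

The type is recursive regardless of whether any recursive procedure ever builds or traverses it, which is precisely the point made in the paragraph above.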

In any case, it is important to emphasise that every recursive construct here delineated (functions, procedures, processes, data structures) has been shown to be recursive for the very same reason: each one of them contains a self-reference or self-call. Having established what recursion is, I now turn to the introduction of recursive techniques into linguistic theory.

3. The historical facts

The origins of generative grammar are of course to be found in the writings of Chomsky in the 1950s and 60s, and that stretch will be our main focus, at least to begin with. This statement doesn’t obviate the fact, let it be clear, that Chomsky made use of various contemporary ideas in the process of constructing his grammar; his great achievement, I suggest, was to put together a very original synthesis of these ideas, and to go beyond them.9 Some of these ideas might be mentioned in what follows, but they won’t be terribly important for our purposes. In any case, the two most important works of Chomsky’s from the 1950s are, as far as our purposes are concerned, The logical structure of linguistic theory (LSLT), written in 1955–6 but published in 1975, and a small portion of that work that was published in 1957, his Syntactic structures book. Despite their prominence, however, it will be more instructive to look at some of the works Chomsky published in the 1960s first, as these provide a better grounding for the foundational underpinnings his generative grammar was to have; we will come back to these two works soon enough.

In a series of papers in 1963 (Chomsky and Miller, 1963; Chomsky, 1963; Miller and Chomsky, 1963), we find the first ample discussion of issues such as the distinction between competence and performance, the levels of description a theory must meet, the expressive power of natural language, and many more. In particular, Chomsky and Miller (1963) outline the format of the generative procedure underlying language, our main point of contention here. Adopting the production systems of Post, Chomsky and Miller (1963) explicitly recognise that this computational formalism is underlain by recursion when they state that the → relation mediating the conversion of some structure f1, …, fn into some structure fn+1 can be interpreted as ‘expressing the fact that if our process of recursive specification generates the structures f1, …, fn, then it also generates the structure fn+1’ (p. 284). According to Pullum (2011, p. 288), this is no more than a summary of Post production systems, and this is certainly correct. The main point to keep in mind is that a definition by induction is a central property of production systems – indeed, recursion justifies the very transformation from the left-hand side of the arrow to the right-hand side; or in other words, every set so generated is a recursively-defined object. As shown above, it was Post himself who drew attention to this point, and this seems to have been acknowledged by Chomsky from the very early stages of generative grammar.

The influence of mathematical logic on Chomsky’s thought was very deep indeed. In Chomsky (1965), it is stated that a grammar ‘must meet formal conditions that restrict it to the enumeration of recursive sets’ (p. 208), and similar statements are offered in the first edition of his Language and mind book, re-edited in 2006 but originally published in 1966: ‘in general, a set of rules that recursively define an infinite set of objects may be said to generate this set’ (p. 112); and, ‘generative grammar recursively enumerates structural description of sentences’ (p. 165). The first quote here is a bit confusing, though, as I think Chomsky meant to write the recursive enumeration of sets rather than the enumeration of recursive sets. Recursively enumerable and (general) recursive sets were introduced by Post (1944) as the objects over which his production systems ranged – the objects produced, that is. A set is recursively enumerable if there is a mechanical procedure that can list/enumerate all its members, while a set is (general) recursive if there is an algorithm that can determine whether a given element is (or is not) one of its members. Given that Chomsky (1980, pp. 119–26) discusses the issue of whether a grammar

7 The reference can be found at http://xlinux.nist.gov/dads/HTML/recursivstrc.html.

8 Cf. Gersting (1982, p. 131), who lists four different constructs to which recursion can be applied within computer science: a recursive sequence (wherein the first one or two values in a sequence are known, and subsequent items are defined in terms of earlier ones); a recursive set (wherein a few specific items are known to be in a set and the other items are built from combinations of items already in the set); a recursive operation (wherein a “small” case of an operation gives a specific value and the other instances of the operation are defined in terms of the smaller cases); and finally, a recursive algorithm (wherein the behaviour of an algorithm is known for the smallest values of an argument, while for larger values of an argument, the algorithm invokes itself with smaller argument values).

9 I thank Mark Brenchley for drawing my attention to this issue.


ought to be seen as a device that generates recursive sets (that is, as an algorithm that determines whether a given sentence is grammatical or not) and dismisses it, I take it that in retrospect the attention in his 1965 book was on recursively enumerable sets rather than on the enumeration of recursive sets.10 As such, then, a generative grammar is no more than a recursive definition of a specific set, as Pullum (2007, p. 1) puts it.
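The distinction between the two kinds of set (an enumerator that lists members versus a decider that settles membership) can be illustrated with a toy case. A sketch under the assumption that the even numbers stand in for any such set; the names are hypothetical:

```python
from itertools import count, islice

def enumerate_evens():
    """A set is recursively enumerable if a mechanical procedure can
    list its members one by one (the listing need never terminate)."""
    for n in count(0):
        yield 2 * n

def is_even(n: int) -> bool:
    """A set is (general) recursive if an algorithm can determine, for
    any candidate, whether it is a member -- and always halts."""
    return n % 2 == 0

print(list(islice(enumerate_evens(), 5)))  # [0, 2, 4, 6, 8]
assert is_even(10) and not is_even(7)
```

For the even numbers both procedures exist; the interest of Post’s distinction lies in sets that can be enumerated but not decided.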

The general recursive property of the grammar is also outlined in LSLT, I believe, especially in pages 194–5, but in the Syntactic structures book we encounter confusing talk of “recursive devices” in terms of the closed loops of finite-state machines (Chomsky, 1957, p. 24). Be that as it may, there is a more prominent application of recursion within production systems in LSLT and later publications, and it is very important that we are very clear as to what this is, including what role it plays in the grammar. Naturally, production systems had to be adapted for linguistic practice, and thus the strings Post manipulated were replaced with linguistic symbols (such as N for nouns, NP for noun phrases, V for verbs, etc.), giving rise to a particular and internal application of recursion within specific rules. Namely, in those cases in which the same symbol appeared on both sides of the arrow, such as in the rule NP → N + NP, capable of generating structures in which NPs are embedded inside other NPs, as in John’s [brother’s [teacher’s book]].
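A recursive rewrite rule of this kind can be simulated directly. A minimal sketch in Python, where the rule NP → N + NP is applied a fixed number of times before a terminating rule NP → N closes the derivation (the function name and bracketing are hypothetical conveniences):

```python
def expand_np(depth: int) -> str:
    """Apply NP -> N + NP 'depth' times, then terminate with NP -> N,
    yielding NPs embedded inside other NPs."""
    if depth == 0:
        return "N"                          # terminating rule: NP -> N
    return f"[N {expand_np(depth - 1)}]"    # recursive rule: NP -> N + NP

print(expand_np(3))  # [N [N [N N]]]
```

The self-embedded output parallels John’s [brother’s [teacher’s book]]: the same category reappears inside itself because the same symbol sits on both sides of the arrow.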

It is important to emphasise that this internal application of recursion within specific rules – call them recursive (rewrite) rules – is very different from the general recursive property of production systems – call the latter recursive specification (of a generative procedure). Indeed, the latter is quite simply a central feature of Post’s production systems qua finite, mechanical procedure – every set, recall, is a recursively generated one – and a wide-ranging look at Chomsky’s writings clearly shows that his main interest has always lain in the general recursive property of the grammar and not in the internal application.

As stated, the internal application of recursion has been more prominent in the generative grammar literature overall. On one front, there has been some discussion regarding where in the grammar one should find recursive rules. In the early years, the generative procedure was divided into two components: the base (composed of rewriting rules that returned strings with associated phrase markers) and the transformational system (a component that would convert phrase markers into other phrase markers, preserving structure). In LSLT, Chomsky goes to great lengths to rid the first component of “recursive statements”, as recursive rules are called there. He codified this as a “non-recursion requirement” for the base component (pp. 517–18), which he vigorously defended at length. In Chomsky (1957), consequently, the recursive property of certain rules is unsurprisingly ascribed to the transformational system, but Chomsky (1965) assigns it to the base component instead.11

On another front, the focus on the internal application of recursion within specific rules was to produce rather important results for what came to be known as the field of formal language theory. Chomsky (1956) showed that production systems could be employed to characterise different collections (classes) of formal grammars and the sequences of symbols (also called strings; these sequences constitute the corresponding formal languages) that these grammars are said to generate/produce. A ranking can then be so devised as to classify these grammars in terms of their expressive power; that is, in terms of the sort of strings a specific formal grammar can generate. Some of the most important classes include the context-free grammars (with rules such as A → a and exemplars of the following type: aⁿbⁿ) and the context-sensitive grammars (with rules like φAψ → φαψ that can generate aⁿbⁿcⁿ strings), both of which make use of recursive rules (the word “context” refers to the material that appears on either side of the arrows). Chomsky (1956) also showed that certain linguistic expressions could not be generated by some of the grammars he identified; in particular, it was claimed that natural language could not be accounted for by a grammar below the context-free class, as recursive rules like S → aSb were necessary in order to generate self-embedded sentences. Given this rather specific point regarding a grammar’s expressive power, it is perhaps not surprising, as I will show, that the literature sees such a close connection between recursion and self-embedded sentences – mistakenly so, as I will argue below.12
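The recursive rule S → aSb and its role in generating aⁿbⁿ strings can likewise be sketched in code. A hypothetical Python rendering, with S → the empty string as the terminating case:

```python
def derive_anbn(n: int) -> str:
    """Derive the string a^n b^n by applying the recursive rule
    S -> a S b n times, then S -> '' (the empty string)."""
    if n == 0:
        return ""                              # S -> empty string
    return "a" + derive_anbn(n - 1) + "b"      # recursive rule: S -> a S b

assert derive_anbn(3) == "aaabbb"
```

Because each application of the rule wraps the previous material with a matching a…b pair, the dependencies are nested, which is exactly what a finite-state grammar cannot capture.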

I won’t be much concerned with either of these two fronts here. For now I just wish to point out that in Chomsky’s aforementioned Language and mind book, wherein the recursive enumeration of a grammar was highlighted, we also find comments that are hard to square with recursively-specified procedures. Indeed, in two different parts of that book the recursive property of language is actually identified with an embedding operation that forms [S…]s within other structures (p. 27), as in derivations in which an NP is rewritten into (DETerminer) N (that S), where S(entence) is reintroduced from an early stage of the derivation (p. 128). Note, first of all, that what the recursive specification of a procedure does is no more than recursively define every stage of a derivation – that is, every object that a production system generates is a recursively-defined set – and this bears no relation to an embedding operation. Note, also, that all a production system does, at least as conceptualised by Post, is rewrite a string of elements into another string, making the whole approach a string substitution system. It is therefore not obvious how one can relate an embedding operation with a rewrite rule, be the latter recursive or

10 I should point out that the 1965 quote comes from a footnote that accompanies some speculation as to whether language is Turing-complete or not; however, as there is no obvious direct relation between the generation of recursive sets and Turing-completeness, I stick to my interpretation as to what Chomsky had in mind with that choice of words.

11 As late as the 1970s, Chomsky was adamant that the base component was neither structure-dependent nor structure-independent (see his comments in Piattelli-Palmarini, 1980, p. 314), the suggestion being, I believe, that the base component is a string-rewriting system through and through – it is the transformational component that is structure-dependent.

12 Note that this front involves the issue of the expressive power of a grammar, what has come to be known as the “weak generative capacity” (i.e., the generation of strings); the linguist, however, is interested in “strong generative capacity” (the generation of structure), a notion that still awaits formalisation.


not, unless it is done by stipulation or conflation. After all, we described a rewriting grammar above as a formalism in which the rewriting rules of the base component return strings with which a phrase marker (a syntactic tree, that is) is associated, the latter then further operated upon by the transformational component, and this is a pretty standard description of a rewriting grammar. Such a take on things is very different indeed from a formalism in which an embedding operation is explicitly outlined. As a case in point, a tree-adjoining grammar, a formalism in which tree-rewriting operations are in fact postulated (see Frank, 2004 for a brief description), exhibits very different properties and features from a production system.

In any case, by the 1970s and 80s, most of the rewriting rules were in fact being eliminated from the theory, perhaps completely so by the time Chomsky (1986) was published. With the advent of so-called government and binding theory, the emphasis was placed on structural configurations and constraints, as it was then argued that rewriting rules were merely recasting lexical properties, and therefore were redundant (Chomsky, 1986, p. 83). Some of these lexical properties have to do with the discovery that all syntactic phrases appear to respect the same geometry, a specifier–head–complement(s) configuration,13 and this information could be placed in the lexicon instead of having rewriting rules generate this basic type of geometry for each phrase. This result appears to apply cross-linguistically, with the further corollary that from a specific point of view a sentence is nothing more than a complex of such specifier–head–complement(s) phrases, which suggests a very general type of recursive structure indeed (Moro, 2008; Lobina, 2011b).

Even though the emphasis shifted from the generative procedure to the structural constraints of phrases during the government and binding years, one global computational operation was retained: move-alpha. In fact, even during this period Chomsky was still claiming that a generative grammar had to provide a recursive enumeration of linguistic objects (Chomsky, 1981, pp. 11–3), and one can only assume that he had move-alpha in mind for such a job – after all, if there isn’t a computational operator, there won’t be an enumeration of any kind. In the 1990s, though, the minimalist program redirected linguistic theory to the study of the mapping function from lexical items to the interfaces, and one single mechanism was identified for this purpose: merge (Chomsky, 1995b). Moreover, Chomsky has been rather clear that recursion underlies merge, as it is a procedure that ‘recursively constructs syntactic objects from [lexical] items…and syntactic objects already formed’ (Chomsky, 1995b, p. 226).14

A recent description delineates merge in very general terms as a set-theoretic operation in which repeated applications over one element yield a potentially infinite set of structures, drawing an analogy between the way merge applies and the successor function (Chomsky, 2008; cf. Kleene, 1952, p. 21). The successor function also underlies what is known as the “iterative conception of set” (Boolos, 1971; Forster, 2008), a process in which sets are ‘recursively generated at each stage’ (Boolos, 1971, p. 223), which has moved Forster (2008) to state that what mathematicians really mean by the iterative conception of set is, ‘in the terminology of modern computer science at least’, the recursive conception of set (p. 97). Strictly speaking, this is incorrect; for a process to be recursive, according to the ‘terminology of computer science’, it must contain chains of deferred operations, the result of an operation calling itself, but this is not the case here. Rather, by ‘recursively generated at each stage’ we understand the ‘repeated application of the successor function’, drawing our attention to the analogy between ‘the way sets are inductively generated…and the way the natural numbers…are inductively generated from 0’ (Boolos, 1971, p. 223). The process really is iterative; it just happens that every stage of the iteration is recursively generated – a subtle distinction between process and generation. In fact, Chomsky (2000), to bring this historical account to an end, clearly states that the language faculty is a recursive definition of a set of expressions (p. 98).15
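The process/generation distinction just drawn can be put in computational terms (the sketch and its function names are my own illustration, not the author's formalism): each natural number is recursively defined via the successor function, yet the process that lists them can be plainly iterative, with no chain of deferred operations anywhere.

```python
def successor(n):
    # Stand-in for the set-theoretic successor operation.
    return n + 1

def first_n_iterative(n):
    """Iterative process: a simple loop; every stage nonetheless applies
    the (recursively defined) successor function once."""
    out, k = [], 0
    for _ in range(n):
        out.append(k)
        k = successor(k)
    return out

def first_n_recursive(n):
    """Recursive process: deferred operations pile up until the base case,
    which is what 'recursive' means in the terminology of computer science."""
    if n == 0:
        return []
    return first_n_recursive(n - 1) + [n - 1]  # the + waits on the self-call

assert first_n_iterative(5) == first_n_recursive(5) == [0, 1, 2, 3, 4]
```

Both procedures enumerate the same recursively-generated stages; only the second runs as a recursive process in the computer scientist's sense.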

Note, then, that the place of recursion within the minimalist program is exactly the same as it was at the very beginning of generative grammar: the central part of the grammar is a recursively-specified generative procedure. Surprisingly, very few scholars have recognised this aspect of Chomsky’s theory, even while tracing the history of how production systems were replaced by merge, as is the case with Bickerton (2009). As it happens, Bickerton fails to recognise the general recursive property of either production systems or merge. Failing to spot the latter is perhaps the result of not understanding what the iterative conception of set entails, while the former can only be put down to an unawareness of the relevant facts concerning Post’s systems. As a result, Bickerton is left wondering why Chomsky keeps insisting on this recursive property, given that he cannot see it anywhere.

An exception is to be found in the work of Tomalin (2006, 2007, 2011), albeit with some discrepancies from what I have attempted here. According to Tomalin (2007), the term recursion is best avoided in linguistics and replaced by “inductive definition”, its original meaning in any case, given that recursion is taken to mean very different things by different people. I am certainly in agreement with Tomalin’s contention that definition by induction (what he calls, confusingly, an inductive definition; see supra) is the original meaning of recursion, and whilst the narrative he offers regarding its place in linguistics is similar to what I have offered here, I also believe that it is not quite right (let alone as detailed as what I am trying to do here).

13 This is the result of so-called X-bar theory; see Moro (2008) for relevant comments.
14 Note, nevertheless, that the aforementioned result regarding the general structure of linguistic objects remains. If so, then, all merge does is generate structures of the following, skeletal kind: […head…[…head…[…head…[…head…]]]]; it is only logical to suppose that all human languages do in fact exhibit such structures.
15 To keep with the analogy to the natural numbers, we may follow Benacerraf (1965) and ascribe to syntactic objects the same properties he ascribes to the natural numbers. Thus, we can advance that the set of grammatical sentences forms a “recursive progression”, as manifested in the overall ‘structure which the members of the sequence jointly exhibit’ (Benacerraf, 1965, p. 69). In this sense, linguistics, like arithmetic, may well just be the science that ‘elaborates the abstract structure’ of a specific type of progression; namely, a recursively-specified, linguistic progression of objects.


In particular, Tomalin (2007) identifies two relevant excerpts from Chomsky (1995a), both of them used in the present essay, that he takes to be highly illustrative of the locus of recursion within the minimalist program; on page 226 of that work, Chomsky states that merge recursively constructs syntactic objects, and given that a syntactic object is in turn recursively defined (that is, it is defined by induction) on page 243, Tomalin (2007) concludes that merge should be viewed as ‘a procedure that utilises inductive definitions’ (p. 1799). Note, then, that the inductive definition merge is said to use refers to the recursive definition of a syntactic object, whereas what I have defended here is that the correct place of recursion within the theory of language Chomsky has constructed lies in the recursively-defined steps of a computation instead, making the transition from production systems to merge a much smoother one.
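The recursive definition of a syntactic object at issue here can be sketched in a few lines (a hedged illustration under my own encoding: merge(a, b) = {a, b}, with frozensets standing in for unordered sets; the lexical items are made up, and nothing here is the theory's official notation). The base clause says that lexical items are syntactic objects; the inductive clause says that merging syntactic objects already formed yields a syntactic object.

```python
def merge(a, b):
    """merge(a, b) = {a, b}; frozenset so that results can nest as members
    of further sets, i.e. merge can apply to 'objects already formed'."""
    return frozenset([a, b])

# Base clause of the inductive definition: lexical items are syntactic objects.
np = merge("the", "man")       # {the, man}
# Inductive clause: merge applies to a syntactic object already formed.
clause = merge(np, "left")     # {{the, man}, left}

assert np in clause and "left" in clause
```

Each application is justified by the definition's clauses, which is precisely the sense in which every step of the computation is recursively defined.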

That being said, however, Tomalin proceeds to list a number of theoretical constructs, most of them from computability theory (some of which include λ-definable functions and Turing computability; more are added in Tomalin, 2011), which he claims are treated in the (linguistics?) literature as being synonymous with recursion, and I find this aspect of his discussion very unconvincing indeed. First of all, in both Tomalin (2007) and Tomalin (2011) recursion is portrayed as having different senses in relation to different classes of recursive functions, but this is an odd position to hold, for these classes of functions are all recursive in the same sense: they are all defined by induction. Of course, the different recursive functions effect different input–output relations, but it doesn’t follow from that that recursion means something different in each case. Secondly, it is one thing to state that, say, λ-definable functions are equivalent to partial recursive functions (or to a Turing Machine); it is another thing completely to claim that anyone is using the term recursion to mean a Turing Machine or the λ-calculus, as Tomalin suggests. In any case, Tomalin has certainly not provided enough compelling evidence to prove that such a state of affairs holds in linguistics (some of his claims are based upon one single citation, more often than not in the absence of any context), let alone that this is the main problem regarding how linguists employ the term recursion. As such, then, I see no reason to support his call to replace the term recursion with an inductive definition; the problems, as I will show below, lie elsewhere.16

In any case, if what I have described in this section is indeed a correct assessment of the history of generative grammar, what on earth is there to disagree about? What indeed. Before moving on to that, though, let us briefly comment upon a question that is lurking in the background of this discussion: why the need for recursion in the first place? Ever since the advent of generative grammar, Chomsky has insisted that a theory of language must account for what he calls the discrete infinity of language, a property he claims to be reflected in the ‘behavior of speaker[s] who…can produce and understand an indefinite number of new sentences’ (Chomsky, 1957, p. 15).17 Moreover, he has been adamant that recursion constitutes an explanation for this phenomenon; indeed, it is claimed in LSLT that it is the ‘recursive character’ of phrase structure rules that allows for the ‘generation of infinitely many sentences’ (pp. 171–2). The connection between recursion and infinity has been maintained ever since; a recent statement of his has it that all recursion means is discrete infinity, the need to enumerate the potentially infinite number of expressions (in Piattelli-Palmarini et al., 2009, p. 387).

There has been a lot of discussion in the literature regarding whether the set of grammatical sentences of a language is really infinite, and whether such a thing can even be proven with the tools of linguistics and mathematics (Langendoen and Postal, 1984; Langendoen, 2010; Pullum and Scholz, 2010). There’s also been some discussion on whether recursion and infinity are necessary constructs of a theory of language, some arguing that these are either unwarranted idealisations or simply unsupported assumptions (Luuk and Luuk, 2011; Tiede and Stout, 2010). It seems to me, however, that the issues at hand have not been treated appropriately; in particular, these publications have failed to look at the aims and motivations that moved Chomsky to introduce recursion into linguistics. In one sense, whether one can prove the actual infinitude of a language is by the by, for Chomsky’s main point has always been that linguistic productivity clearly exceeds human memory, and thus that a derivational account of linguistic knowledge must be provided. According to Chomsky, it will not do to provide a list of constructions or structural constraints that sentences must fit into; a computational system must be delineated instead.

Given that this was the main motivation, it is not at all surprising that Chomsky looked to recursive function theory for a mechanical procedure that would take the finite amount of information the brain holds and generate therefrom the very many sentences we can produce and understand. After all, the results achieved in the 1930s and 40s by mathematicians and logicians must have been rather influential in the world and milieu in which Chomsky found himself in the 1950s; indeed, for many years Kleene (1952) was the main textbook on those topics, and as Soare (1996) has shown, the terms recursion and recursive came to be regarded as being synonymous with computation and computable within the field, a state of affairs to which Kleene’s book greatly contributed and that must have influenced Chomsky and many others significantly. I shan’t stop to consider whether these factors make the employment of recursion for the purposes Chomsky had in mind a less secure one, though. Whilst a valid point (one could imagine a non-recursive computational system, after all), I will assume that the

16 In addition, Tomalin does a bad job of chronicling both the introduction of recursion into linguistics and the subsequent developments this concept underwent. His 2006 book analyses the influence the formal sciences had on Chomsky, but his focus there lies on work Chomsky did before Syntactic structures, published in 1957. Surprisingly, the very last chapter of Tomalin’s book, republished as the 2007 paper, travels all the way into the future to Chomsky (1995a), so no historical narrative is in fact on offer (it is as a result surprising that Tomalin calls this paper a reconsideration of recursion in linguistic theory). As for his 2011 publication, his worries there are exclusively centred on Chomsky (1957), which is not the best place to look for what Chomsky took recursion to be.
17 I think the word “indefinite” has it right in this quote, as I will explain in a bit.


place of recursion in linguistics, at least as conceptualised by Chomsky, is justified and proper and proceed with the sort of analysis I advertised at the beginning of this essay.18,19

The following section will provide a rather rapid description of the work of many scholars, and I will certainly not discuss the finer details of any of the publications I will cite – that’s really not that important here. Instead, I aim to highlight what it is that these scholars are talking about, rather than what it is that they are saying about what they are talking about. That is, I want to lay out the actual topics these scholars are engaged with, an exercise that hasn’t so far been carried out in the literature, despite the many publications on the topic. In particular, I will show that even though all these authors purport to be talking about recursion, that is not quite right; nor could it be said that these scholars are discussing different aspects of recursion, for in most cases these papers equate recursion with a phenomenon that has little to do with it.

4. A story of conflations

What on earth is there to disagree about, I asked towards the end of the previous section. As a matter of fact, and as far as the recursive property of language goes, none of the publications I have cited so far gave rise to much criticism or even commentary, even though Chomsky was rather specific on this topic therein.20 It is not until the publication of Hauser et al. (2002) (HCF) that we find much disagreement in the literature, and this is perhaps unfortunate, for a number of reasons. First of all, recursion didn’t actually feature extensively in that article; rather, HCF constitutes an evolutionary and comparative conjecture regarding the possibility that the language faculty may be the result of various systems and principles. Indeed, recursion is therein defined in very general terms as a neurally implemented (p. 1574) computational mechanism (p. 1573) that yields a potentially infinite array of discrete expressions (p. 1574) from a finite set of elements (p. 1571). Secondly, it appears that both Hauser and Fitch hold a different idea from Chomsky’s regarding what recursion actually is. As is clear in Hauser (2009) and Fitch (2010), for example, these authors identify recursion with an embedding operation (an operation that puts an object inside another, bigger object), whereas Chomsky’s focus has always lain on the recursive generation at the heart of the language faculty. Indeed, a self-embedding operation is unrelated and unrelatable to recursive definitions. Moreover, it seems to me that it is easier to connect HCF’s vague characterisation of recursion to Chomsky’s interpretation than to what appears in either Hauser (2009) or Fitch (2010).

Be that as it may, an early commentary on HCF, Pinker and Jackendoff (2005), has it that recursion ‘refers to a procedure that calls itself, or to a constituent that contains a constituent of the same kind’ (p. 203), the accompanying footnote suggesting that the “or” in that sentence is not to be understood as exclusive disjunction. Therein, indeed, we are told that computer scientists usually distinguish between true recursion and tail recursion, the latter being ‘a procedure [that] invokes another instance of itself as a final step (or, in the context of language, a constituent [with – DJL] an identical kind of constituent at its periphery)’ (Pinker and Jackendoff, 2005). No references are offered in this footnote, so we can’t know what practice of computer scientists they are actually referring to; in any case, I take “true recursion” to simply mean recursion, and point out that the distinction computer scientists usually draw in descriptions of recurrent computational processes is between recursion and iteration. In fact, tail recursion is a subtype of recursion and bears a rather subtle relation to iteration, one that is in some ways dependent upon the programming language being used.21
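The tail-recursion point can be made concrete with a standard example (my illustration, not Pinker & Jackendoff's): in the first version below the self-call is not the final step, so multiplications are deferred until the base case; in the second the self-call is the final step (tail position), which a Scheme-style implementation would execute as an iterative process (Python, notably, does not eliminate tail calls, so both grow the call stack there).

```python
def factorial(n):
    """Recursion proper: the multiplication is deferred, waiting on the
    result of the self-call, so a chain of pending operations builds up."""
    if n == 0:
        return 1
    return n * factorial(n - 1)

def factorial_tail(n, acc=1):
    """Tail recursion: the self-call is the exit operation; nothing is
    deferred, the running result travels in the accumulator."""
    if n == 0:
        return acc
    return factorial_tail(n - 1, n * acc)

assert factorial(6) == factorial_tail(6) == 720
```

Both are recursive procedures; whether the second gives rise to an iterative process is a fact about the language implementation, which is the subtlety footnote 21 alludes to.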

In any case, it is not clear why Pinker & Jackendoff summon computer science rather than mathematical logic, given that there are some differences in the manner in which these two disciplines treat the notion of recursion, and Chomsky was clearly influenced by the latter and not the former. As mentioned earlier, mathematicians have mainly focused on recursive functions and the sets these generate, whilst computer scientists have focused on recursive procedures, recursive processes, and recursive data structures. More importantly, the highlighted quote jumps from a property of a computational procedure

18 Contra Tiede and Stout (2010), then, it is not the case that infinity and recursion are interconnected notions such that one of them must be assumed and the other derived from it. Rather, the unbounded and novel linguistic behaviour is the observation – the explanandum – and a recursively-specified computational system is the explanans. Contra Mukherji (2010, p. 213), in turn, self-embedded sentences are not the evidential basis for recursion; novel linguistic behaviour is.
19 As mentioned earlier, Sieg (1997) believes that it is most natural to regard generative procedures as ‘finitary inductive definitions’ (p. 166), which may be taken as a suggestion that recursion is central to what a computation is. As Soare (2009) has pointed out, though, there is a difference between Post’s systems, which are meant to generate (list) a set of integers, and the formalisms of Church or Kleene (general recursive and partial recursive functions, respectively), which aimed to identify the class of computable functions. The apparent equivocation between recursion and computation, the result, recall, of what Soare (1996) calls the Recursion Convention, may be the key to understanding Chomsky’s insistence upon recursion, but I will not pursue that conflation here. We might also add that what used to be called the recursive function theory field is now most properly referred to as computability theory; in addition, what Post used to call recursively enumerable sets and recursive sets are nowadays known as computably enumerable sets and computable sets. I will just state that in a recent set of interviews, Chomsky points out that infinite systems (such as language) were only properly understood once mathematicians had formalised notions such as finite axioms, algorithms, and recursive procedures (his terms, in Chomsky, 2012, p. 64), and that statement tells us a great deal about where Chomsky was coming from, I believe.
20 As an exception, Corcoran (1972) argues that Chomsky’s use of recursion wasn’t a great deal in itself, given that logicians had already mastered recursive techniques by then. An interesting comment, I think, but one that centres on definitions by induction and their role in mathematical logic; certainly not a remark that meddles with the number of conflations to be listed here.
21 Technically, a tail-recursive process is one in which the self-call takes place at the end of the computation, constituting the exit operation. However, the issues are a bit more complicated than that, as discussed in SICP, pp. 33 et seq. Therein, SICP point out that in LISP a tail-recursive process is one in which a recursive procedure produces an iterative process.


(a tail-recursive one) to a property of structures (complex constituents and the like, the linguistic context Pinker & Jackendoff allude to), but the transfer from one to the other is wholly unwarranted, as I will show below.

In more general terms, though, a large number of scholars within cognitive science seem to hold the belief that recursive procedures only generate recursive structures; or, to go in the opposite direction, that recursive structures can only be generated (or processed, in the case of real-time behaviour) recursively. This is the first conflation I aim to air out, and then shoot down:

Conflation I: Confusing recursive mechanisms with their products.

Perhaps the clearest culprits are Corballis (2011) and Jackendoff (2011), but a few others are mentioned and discussed in Lobina (2011b), a review of a book on recursion and language, and in Lobina (2011a), a critique of the manner in which artificial grammar learning experiments have probed recursion (some of the scholars there treated will be discussed below). In Corballis (2011), we find that recursive rewriting rules are employed to define the resultant structures as recursive; a sort of definitional technique, that is. This is not only unnecessary, but it’s likely to confuse rather than clarify. It is unnecessary because rewriting rules are not in fact needed to so define the structures Corballis has in mind. After all, Corballis also considers the possibility of recursive structures in general cognition, but with no attempt whatsoever to delineate the underlying systems that would generate them. Indeed, all he does is describe mental structures that are structurally similar to self-embedded expressions, and no more is in fact needed to drive his point. Moreover, in one of the early footnotes in his 2011 book (ft. 7, p. 229), Corballis notes that even though he has been told that rewriting rules are ‘an old-fashioned way of showing the structure of sentences’, he doesn’t think that this is a problem for his exposition. Actually, rewriting rules were employed in order to generate sentences rather than to show their structure (even if they could be so employed), and Corballis couldn’t carry out the same strategy with merge, as this mechanism proceeds in the same manner for both embedded and self-embedded structures alike, and yet self-embedded sentences are different from embedded sentences.22

Jackendoff (2011) offers an even clearer example of this conflation. After drawing a distinction between what he calls “formal recursion” (a set of rules that can apply to its own output) and “structural recursion”, the usual allusions to the formal sciences included, but without referencing anything at all, he tells us that evidence for the former relies on the latter (pp. 592–3). I don’t doubt the usefulness of such a distinction (even though the definition of formal recursion is partly mistaken), but the last statement would certainly have surprised a 1950s Noam Chomsky, for the noted linguist was at the time keen to use recursive techniques in order to account for the constant, novel linguistic behaviour he thought was the norm – “recursive structures” were certainly not the evidential basis for formal recursion.

In any case, the obvious point to make here is that the existence of recursive structures in a given cognitive domain does not necessarily mean that they were, or indeed that they must be, generated recursively. This was already alluded to above in the context of the extensional equivalence of various computational formalisms, but a more specific example will clarify. Supposedly self-embedded aⁿbⁿ strings can be generated both by recursive rules such as S → a(S)b and by non-recursive chains of rules like the following: A → aB, B → aC, C → aD, D → bE, E → bF and F → b.
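The example can be run mechanically (the encodings below are mine): the recursive rule S → a(S)b generates aⁿbⁿ for any n, while the fixed, non-recursive chain A → aB, …, F → b generates exactly one string, "aaabbb" – the same supposedly "self-embedded" string, produced with no recursion anywhere.

```python
def recursive_anbn(n):
    """Recursive rule S -> a S b, with the optional S as base case."""
    if n == 0:
        return ""
    return "a" + recursive_anbn(n - 1) + "b"  # the self-call does the work

def chain_aaabbb():
    """Non-recursive chain A->aB, B->aC, C->aD, D->bE, E->bF, F->b:
    each rule emits one terminal and hands over to a *different* symbol."""
    rules = {"A": ("a", "B"), "B": ("a", "C"), "C": ("a", "D"),
             "D": ("b", "E"), "E": ("b", "F")}
    sym, out = "A", ""
    while sym in rules:
        terminal, sym = rules[sym]
        out += terminal
    return out + "b"  # final rule F -> b

assert recursive_anbn(3) == chain_aaabbb() == "aaabbb"
```

The structure of the output string is identical in both cases; only the generating mechanism differs, which is precisely the conflation at issue.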

The main problem, in any case, is that by accepting this conflation one ends up holding the mistaken belief that if recursive mechanisms are the only specific part of the language faculty, then this should be reflected in the presence of self-embedded sentences in every language of the world. This is of course the main point of contention in the literature, but it’s hard to understand why so many scholars hold this view. Take van der Hulst (2010), the aforementioned collection of papers on the role of recursion in language. As Lobina (2011b) has chronicled, many of the papers in that collection start by quoting HCF’s recursion hypothesis, whatever they take this hypothesis to actually involve, and then move on to the issue of the universality of self-embedded sentences. However, given the way in which the actual hypothesis reads, it is hard to understand the interpretations it has given rise to; to quote in full:

We propose in this hypothesis that FLN [the faculty of language in the narrow sense – DJL] comprises only the core computational mechanisms of recursion as they appear in narrow syntax and the mappings to the interfaces (p. 1573)

Given what this quote actually says, why would anyone, first of all, identify the phrase “computational mechanisms” with self-embedded sentences? And further, why would the combination of self-embedded sentences, if this is what is to be understood by the expression the core computational mechanisms of recursion, and interface mappings constitute the unique and central part of the language faculty? Admittedly, ‘the core computational mechanisms of recursion’ is a somewhat infelicitous choice of words, but surely HCF don’t hold the beliefs these two questions point to. Such is the state of the field, and this is perhaps clearest in the case of Dan Everett’s study of the Pirahã language (2005, 2008, 2009, 2010).

Everett’s work, however, fails to properly engage with Chomsky’s introduction and use of recursive techniques. Everett (2010), in particular, starts with the ‘important book-keeping’ matter of defining recursion, and he offers two interpretations, one that characterises recursion as an operation that applies over its own output, and another that is basically a definition of a recursive set (p. 1). Ten pages later we are told that these definitions are what ‘computer scientists mean by

22 Corballis (2011), it has to be said, contains a barrage of errors: p. 7 conflates recursive generation and embedding; on p. 23, I-language is described as a “language of thought”; and in ft. 12, p. 229, Corballis offers his peculiar take on the history of generative grammar: “deep structure” gave way to “Universal Grammar”, and this in turn to I-language; and the latter, by implication, to the language of thought.


recursion’, but no references are offered – perhaps unsurprisingly so. In any case, after stating that he will keep to the first definition, he moves on to a study of self-embedded sentences within Pirahã, but nowhere are we shown that there is a direct connection between an operation that applies over its own output and self-embedded sentences. Be that as it may, his conclusion, as is well-known, is that the Pirahã language lacks all types of self-embedded structures, and therefore this constitutes, for him at least, a clear refutation of not only HCF’s hypothesis, but of Chomsky’s paradigm in toto (Everett, 2008, p. 229). Based on what was chronicled in the previous section, however, this cannot be the case, for recursion was introduced into linguistics as a general feature of the underlying generative procedure, and it certainly doesn’t depend upon the presence of self-embedded sentences. Thus, I can only be in agreement with Tomalin (2011), for whom Everett’s claims are fallacious.

As a result, Everett is reduced to a rather precarious situation in order to salvage his claims. Tellingly, none of the publications of his that I have cited here quote or discuss any of the sources I used in the previous section, his focus lying exclusively on HCF instead, and all this accompanied with rather unsubstantiated claims regarding the history of recursion in linguistic theory. The actual line of his argument goes something like this in his Don’t sleep, there are snakes book. Computer scientists, linguists, psychologists, and philosophers identify recursion with a matryoshka-doll characteristic (things inside other structurally-equivalent things, that is; pp. 227–8). Recursion was introduced into linguistics as a way to account for the infinite use of finite means, even though no ‘linguist could really provide a coherent story of what that expression really means in scientific terms’ (p. 228). Even though recursion has traditionally always meant ‘the ability to put one item inside another of the same type’ (Everett, 2008), ‘a curious thing has happened’ since his work was published, Everett continues, as the definition of recursion has mutated among Chomsky’s followers, now simply meaning some ‘form of compositionality’ (p. 229).

The whole argumentative thread is clearly incorrect. The recursive algorithms of computer scientists, or the recursive functions of mathematicians and philosophers, don’t exhibit any sort of self-embedding, but a self-referential feature instead; that is, these algorithms and functions compute values that have been computed by earlier applications of the same algorithms and functions, but there are no functions embedded into other functions. Also, the mathematicians of the 1930s and 40s quite clearly showed how finite axioms could compute/generate infinite sets, be it in terms of partial recursive functions, production systems or a Turing machine. Taking his cue from these results, Chomsky introduced recursion into linguistics in the very sense of these now-famous studies from the formal sciences; this, if anything, is its traditional connotation in linguistics. What’s more, the definition of recursion hasn’t really changed recently, at least not in the case of Chomsky’s writings, as I have shown here (as mentioned, Everett doesn’t actually discuss any individual publication of Chomsky’s). In the end, in his 2010 paper, and despite his protestation as to the lack of clarity vis-à-vis recursion in HCF (p. 1), he is reduced to selectively extracting those quotes from HCF that can be used to make his case (6 lengthy citations in the space of a couple of pages, as it happens), ignoring most of the history of generative grammar, and that is a practice that doesn’t strike one as particularly good scholarship.

Be that as it may, whether Pirahã exhibits self-embedded sentences or not is an interesting topic in itself, and this issue can certainly be approached independently of the recursive generation merge effects. As mentioned, treating objects themselves as recursive is very common in computer science but unheard of in mathematical logic, but I fail to see why that ought to have any import for linguistics. It is true that mathematical logic only allows recursive objects (such as recursive sets) if they are the output of recursively-defined functions, and this practice of mathematicians seems to have blinded some, such as Collins (2008), into believing that a linguistic object can’t be described as recursive unless it is generated by a recursive function (pp. 160–1). Collins discusses this very issue in the context of the arrays of asterisks that Pinker & Jackendoff have used in various publications to make a point about recursive data structures in domains other than language; therein, Collins argues that the collection of asterisks can’t be regarded as recursive independently of the function that generates it, but it seems to me that this criticism is unpersuasive.23 Linguistics, after all, is not mathematical logic, so why should linguistic theory be so constrained? In any case, the distinction between recursive mechanisms and recursive structures is, it seems to me, a useful one, and one that has featured in the literature extensively.24

As mentioned, according to Pinker and Jackendoff (2005) a recursive structure results when 'a constituent…contains a constituent of the same kind' (p. 203), a more particular definition than the general one I provided supra. Naturally, it is rather important to establish what the "X within an X" of my definition stands for. In the case of the definition employed by Pinker and Jackendoff (2005), kind refers to the category of the element that heads a constituent. Accordingly, an XP within an XP would be a case of recursion, whereas an XP inside a YP would not be. As stated earlier, however, the language faculty appears to generate recursive structures of a very general type in terms of specifier–head–complement(s) (SHC) phrases. As such, I venture to say, merge 'recursively constructs' Pirahã objects as much as it constructs the syntactic objects of any other language: they are nothing but bundles of SHC phrases.

Before moving on to the next conflation, there is a subtler version of conflation number one that is worth discussing, one that doesn't centre on recursive computational mechanisms, but on the very recursively-defined functions mathematical logic intensively studied in the 1930s and 40s. One example is to be found in Arsenijević and Hinzen (2010), a rather unexpected occurrence, as one of the authors has elsewhere correctly characterised merge as being underlain by the successor function (namely, in Hinzen, 2009). In fact, Arsenijević and Hinzen (2010) start by appropriately describing merge as a recursive function (p. 166), but immediately after, in a blink-and-you-miss-it sort of moment (that is to say, within brackets), they further state that this is reflected in the observation that linguistic expressions may exhibit structure in which a category becomes a part of a bigger category of the same kind.

23 Mind you, the point made in, e.g. Jackendoff and Pinker (2005, p. 218), seems a banal one. Therein, they present an array of asterisks organised in columns and rows to make the point that for, say, a grouping of five asterisks, the ones in the middle could be interpreted as being embedded into the bigger series. There is no obvious internal hierarchy among the asterisks, however, apart from Jackendoff & Pinker telling us that we could entertain such an interpretation.
24 This paragraph supersedes the conclusion in Lobina and García-Albea (2009), where it was suggested that recursion should only be applied to computational mechanisms and not to objects.

Crucially, however, recursive functions and self-embedded sentences are not comparable in nature or structure. The latter, but not the former, exhibit a containment relation between two elements of the same category; recursive functions, on the other hand, are defined with a two-equation system in which the recursive step computes values in terms of previously defined values, but there is no self-embedding of any kind. The self-calls merely point to the values the function calculates simpliciter, but there is no hierarchy of functions (think of the factorial of 4, which becomes 4 × (factorial 3), and then the factorial of 3 turns into 3 × (factorial 2), and so on until reaching the base case).
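To make the contrast concrete, here is a minimal sketch (in Python, my own illustration) of the two-equation system that defines the factorial; note that the self-call merely points to a previously defined value, and no function ends up contained inside another:

```python
def factorial(n: int) -> int:
    """Factorial defined by the usual two-equation system."""
    if n == 0:
        return 1                  # base case: factorial(0) = 1
    return n * factorial(n - 1)   # recursive step: defined via a prior value

# The computation of factorial(4) unfolds into 4 * (3 * (2 * (1 * 1))) = 24;
# the self-calls return numbers, they do not embed functions into functions.
```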

Similarly, MacWhinney (2009), right after stating that he will identify recursion with an inductive analysis (referencing Tomalin, 2007 and Bar-Hillel, 1953), provides examples for the 'syntactic effects of recursion', namely relative clauses (p. 406), apparently oblivious to the fact that self-embedded sentences and inductive definitions bear no relation whatsoever.25

This is the second type of conflation I will list here, a rather puzzling one, I must admit.

Conflation II: Conflating recursively-defined functions and self-embedded sentences.

If scholars have confused the self-reference so typical of recursively-defined functions with self-embedded sentences, others could perhaps be forgiven for conflating the recursiveness of some rewriting rules and self-embedding operations. This blending, at least, doesn't go from a property of a computational mechanism to a property of structures; it simply fuses two different computational operations into one. Fitch (2010) is a case in point. After announcing that he will be discussing key distinctions in the understanding of recursion for the biolinguistic enterprise, he proceeds to describe how this term is understood (or ought to be understood) in computer science, metamathematics, and linguistics. I will focus on Fitch's treatment of recursion within linguistics only, for his discussion there bears no relation to what he has to say about the place of recursion in computer science and metamathematics. Indeed, after offering a more or less adequate outline of recursion in the formal sciences, Fitch stipulates, first of all, that in linguistics a recursive rewriting rule has the property of self-embedding (p. 78); then, he claims, it has been a long-standing 'linguistic stipulation' that a self-embedding rule generates a self-embedded structure (p. 80). As a consequence, Fitch equates recursion with a self-embedding operation, a connotation he does not connect with what he has outlined before regarding the formal sciences.
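The point at issue can be illustrated with a toy rewriting rule of the familiar sort, S → a S b (my own example, not Fitch's): the rule is recursive simply because the symbol S reappears on its right-hand side, a self-call that says nothing, by itself, about embedding structures into structures.

```python
def expand_S(depth: int) -> str:
    """Expand the rule S -> a S b, terminating with S -> ab.

    The recursiveness lies in the self-call (expand_S calls expand_S);
    whether the resulting string is read as 'self-embedded' is a fact
    about the structure assigned to it, not about the rule itself.
    """
    if depth == 0:
        return "ab"                          # terminating rewrite
    return "a" + expand_S(depth - 1) + "b"   # self-call on S

# expand_S(2) yields the string "aaabbb"
```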

As a matter of fact, it is erroneous to equate a recursive rewriting rule with a self-embedding operation, for a rewriting rule is recursive on account of the self-call it effects; it knows nothing about embedding (or self-embedding) structures into other structures. As a result, Fitch (2010) is in actual fact interested in the study of self-embedded structures, as evidenced in his belief that constructing the right interpretation of a self-embedded structure constitutes an "empirical indicator" for recursion (pp. 80–81), wilfully unaware, it seems, that a self-embedded structure bears no relation to a definition by recursion (i.e., to the property that makes merge a recursive generator). This is clearly an unfortunate lapse into semantics for a notion that is intrinsically syntactic; that is, a notion that only refers to the nature of the generative or processing mechanism, independently of the interpretation the resultant structure receives. More importantly, we are owed an explanation as to why this could tell us anything about recursive generation (competence) or processing (performance), as opposed to the nature of self-embedded structures only. Let us state conflation number three thus:

Conflation III: Conflating recursive rules and self-embedding operations.

There is something more at stake here, however. It is certainly true that the replacement of production systems with merge involves the further postulation of an operation that 'embeds (an object) within some construction…already formed' (Chomsky, 1995b, p. 248). However, the last point should not be confused with the aforementioned definition of merge as a procedure that recursively constructs syntactic objects. That is, embedding objects into other objects is an aspect of what merge does, the recursive generation that takes place at each stage of a derivation being another. To complicate things further, there is also the question of how precisely merge proceeds; how one stage of a derivation is followed by another is yet another aspect of computational systems. Consequently, it is important to stress that recursion and (self-)embedding are two different things, as (for that matter) are (self-)embedded structures and (self-)embedding operations. To believe otherwise is to confuse, among other things, what an operation does with how it proceeds. This is the last and perhaps the most widespread of the conflations I am calling out.

25 It's not clear how the sources this scholar cites connect with an inductive analysis, though. I have already discussed Tomalin's work supra; as for Bar-Hillel (1953), this paper suggests that the social sciences might benefit from employing recursive definitions. Presumably, Bar-Hillel was interested in a more precise definitional technique for theoretical constructs, or as he put it, recursive definitions are 'a perfectly legitimate means of concept formation' (p. 162), but this employment of recursion does not relate to either the recursive specification of generative procedures or recursive rewriting rules. Interestingly, Chomsky (1955, p. 45) manifests his agreement in spirit, while two years later he sees 'success along these lines unlikely' (Chomsky, 1957, p. 58); in any case, Chomsky (1995b, p. 243) provides a recursive definition of syntactic objects.


Conflation IV: Confusing what an operation does with how it applies.

The main problem with this conflation is that the literature seems to be confusing a recursive step in a computation with an operation that embeds elements into other elements. As far as computer scientists are concerned, recursion is a property of how an operation proceeds, and not of what an operation does; indeed, a great number of computations can be specified recursively (the factorials, the Fibonacci series, etc.), but they all exhibit different combinatorial operations.26
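A short sketch (my own, in Python) may drive the point home: the factorial and the Fibonacci series are both specified recursively, and in exactly the same manner, yet the combinatorial operation each effects is different (multiplication in one case, addition in the other).

```python
def fact(n: int) -> int:
    # recursively specified; the operation effected is multiplication
    return 1 if n == 0 else n * fact(n - 1)

def fib(n: int) -> int:
    # recursively specified in the same manner; the operation effected
    # is addition (and there are two self-calls rather than one)
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```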

Unfortunately, the conflation between these two different things is rather common in the literature. Many examples canbe found in van der Hulst (2010), some of which I will discuss in what follows. I will only provide the full reference of thecontributions I treat more extensively, otherwise I will just state the last name of the authors (for a review of the overallcollection, see Lobina, 2011a).

Some of the contributions to this book (namely, those of Karlsson, Verhagen, Kinsella, Harder, Hunyadi) discuss various constructs and it is necessary to clarify what they stand for; these include centre-embedding rules, tail-recursive rules, the sort of structures these generate, their relationship, etc. A centre-embedding rule is supposed to generate nested structures in which a sentence is embedded in the middle of a bigger sentence, like those which were called self-embedded expressions supra: [the mouse [the cat [the dog bit] chased] ran away]. A tail-recursive rule, on the other hand, embeds elements at the edge of sentences, either on the left-hand side (John's [brother's [teacher's book]]) or on the right-hand side (the man [that wrote the book [that Pat read in the cafe [that Mary owns]]]). These terms, however, besides constituting a redundant excess of terminology, have absolutely nothing to do with the recursive character of the rules themselves, but only with the type of embedding the resultant expression manifests.27

A centre-embedding rule, after all, is not a rule in which the reflexive call occurs, literally, in the middle of a derivation; no linguistic theory actually derives centre-embedded sentences in such a manner. The employment of the term tail-recursive is perhaps more unfortunate, as the self-call of these processes, recall, is not an instance of a (self-)embedding operation; the recursive call of tail-recursive algorithms is one that takes place at the very end of a computation, but what operation exactly it carries out depends upon the precise computation that is being implemented. In any case, a nested structure on the left-hand side of a sentence cannot be the result of a tail-recursive rule as these scholars understand it if the derivation process undergoes left-to-right applications of rewriting rules, and this is even more true in the case of the bottom-up derivations merge carries out. In a nutshell, these terms refer to specific properties of structures, not of operations. If anything, these scholars are guilty of supposing that the structure of a computational process manifests itself in a transparent manner in the object so constructed, but that is a rather puzzling belief to hold.
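For concreteness, a tail-recursive factorial can be sketched as follows (a standard accumulator version, offered as my own illustration): the self-call is the last thing the process does, and the operation it happens to carry out (multiplication) is fixed by the computation being implemented, not by the tail-recursive manner of proceeding.

```python
def fact_tail(n: int, acc: int = 1) -> int:
    """Tail-recursive factorial: nothing is deferred across self-calls."""
    if n == 0:
        return acc
    # the self-call occurs at the very end of the computation;
    # the multiplication is folded into the accumulator beforehand
    return fact_tail(n - 1, acc * n)
```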

A particular case of this conflation is to be found in Parker (2006) and Kinsella (2009).28 Curiously, this author definesrecursion in terms of what sort of operation it carries out, whilst iteration is defined instead in terms of how it proceeds,thereby disentangling these recurrent operations into two very different classes. As such, iteration involves, Kinsella tells us,repeating an action an arbitrary number of times (Kinsella, 2009, pp. 115–9), while recursion implicates embedding an objectwithin another instance of itself (Kinsella, 2009). She claims to derive these definitions from the computer science literature(Kinsella, 2009), but once more not a single reference is provided. In the end, it is quite clear that she is focused on recursivestructures and not on computational mechanisms, as she quite explicitly states that recursion ‘inherently involves semantics’(p. 127), this being exemplified in two constructions only: possessives and subordinate clauses (p. 150). As a result, Kinsellahas been unable to identify the recursive quality ofmerge, which she confusingly denies by stating thatmerge is a procedure,while recursion is a characteristic (ft. 20, p. 129).

Going back to van der Hulst (2010), some of its contributors seem to have a much stronger claim in mind. Karlsson, following Parker (2006), contends that "nested recursion" rules (what others call centre-embedding rules) cannot be reduced to iterations (while tail-recursion supposedly can), a claim that is repeated by Harder (p. 239) and, with qualifications, in Zimmerer & Varley's contribution (p. 397). They could not possibly mean this as a general point about computability theory, however. In fact, one of the references we used above is indirectly mentioned in van der Hulst (2010, p. 347), namely, Liu and Stoller (1999), which, it will be recalled, offers a framework that provides automatic transformations of any type of recursion into iteration, an "optimization technique" that can cope with the most complex of recursive relations, such as multiple base cases or multiple recursive steps, of which Fibonacci sequences are an example (contrary to what Fitch, 2010, p. 78 seems to think).
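By way of illustration (a hand-written conversion of my own, not Liu and Stoller's actual framework), even Fibonacci, with its multiple base cases and multiple recursive steps, reduces to a simple loop:

```python
def fib_iter(n: int) -> int:
    """Iterative Fibonacci: the two previously computed values are
    carried along explicitly, so no recursive self-calls are needed."""
    a, b = 0, 1          # the two base cases, fib(0) and fib(1)
    for _ in range(n):
        a, b = b, a + b  # one loop step replaces the two recursive steps
    return a
```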

If this point does not hold for mechanisms, one may still wonder if it holds for structures.29 Some of the papers just mentioned seem to believe that self-embedded sentences cannot be converted into other types of phrases, but this is explicitly denied in Kinsella's contribution (p. 188). As she makes clear, even if languages like Pirahã were shown to fully do without self-embedded sentences, this would not translate into an expressive loss to their speakers. That is, there is no reason to believe that Pirahã cannot 'express [similar] concepts using alternative means' (Kinsella, 2009). Indeed, a self-embedded sentence such as the mouse the cat the dog chased bit ran away seems to be easily convertible into either the dog chased the cat that bit the mouse that ran away (which I would call a right-branching structure) or the dog chased the cat and the cat bit the mouse and the mouse ran away (conjunction). This is an old point, actually. Langendoen (1975, p. 199) mentions that English extraposition allows the conversion of centre-embedded structures into right-branching ones; and of left- and right-branching sentences into coordination. This aspect of the discussion is perhaps the clearest sign that there is a conflation at heart: even though most of the claims here treated are framed in terms of computational operations, they actually only refer to the structural features of sentences; what's more, this diagnostic applies, mutatis mutandis, to all the other conflations.

26 SICP offer recursive and iterative computations for both the factorial class (pp. 33–4) and Fibonacci sequences (pp. 39 et seq.).
27 These definitions are redundant because these structures had already been defined long ago, and in clearer terms to boot. Chomsky (1965) drew a distinction between nested constructions, namely, 'phrases A and B form a nested construction if A falls totally within B, with some nonnull element to its left within B and some nonnull element to its right within B' (p. 12), and self-embedded structures, that is, a 'phrase A is self-embedded in B if A is nested in B and, furthermore, A is a phrase of the same type as B' (Chomsky, 1965), whilst Chomsky and Miller (1963) discuss left- and right-recursive structures. If anything, these definitions demonstrate that structures have been defined as recursive from the very beginning of generative grammar, and unproblematically so, I may add.
28 Both papers belong to the same author; a version of the former appears under the author's married name, Kinsella, in van der Hulst (2010).
29 In fact, the claim that recursion cannot be reduced to iteration, whilst typically framed in terms of computational mechanisms, is usually actually exemplified in terms of structures, a very different claim indeed; as a case in point, see Uriagereka's contribution in Piattelli-Palmarini et al. (2009).

I should end this section by briefly discussing two further issues. One of these was alluded to earlier: some scholars talk of recursive generation very loosely as a process that applies over its own output (e.g., Boeckx, 2009; Hornstein, 2009; Everett, 2009). That a recursive process involves an operation that applies over its own output is of course trivially true, but it does not differentiate it from iteration, both processes being cases of recurrences; thus, both processes apply over their own outputs. What distinguishes one from the other is that a recursive process involves a self-call; thus, a recursive operation applies over a part of its own output, the rest typically being operations and structures that are kept in memory (in a stack) as the internal, recursive applications proceed to resolve each sub-task. This is rather clear in the recursive computation of the factorials; the very first output of the computation of the factorial of 4 is (4 × factorial 3), and the next operation takes only (factorial 3) as its input, not the entire output (the bit (4 ×) is kept in memory, to be completed later on).
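The role of the stack can be made explicit with a small sketch (my own illustration): the deferred multiplications, the "(4 ×)" bits, wait in memory while each internal self-call takes only part of the previous output as its input.

```python
def factorial_via_stack(n: int) -> int:
    """Factorial computed by making the deferred operations explicit."""
    pending = []                 # stack of multiplications still to complete
    while n > 0:
        pending.append(n)        # defer "(n *)"; only factorial(n-1) proceeds
        n -= 1
    result = 1                   # the base case
    while pending:
        result *= pending.pop()  # complete the deferred operations in turn
    return result
```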

The other issue has to do with whether syntactic derivations proceed recursively in the technical sense of computer science. I don't know of any evidence suggesting that linguistic derivations proceed recursively in this sense, though. It seems to me that there are no chains of deferred operations in syntactic derivations, and that merge recursively constructs syntactic objects successively instead. There is no contradiction here, but the point is likely to confuse and confound. Both Bickerton (2009) and Luuk and Luuk (2011), for instance, can't recognise the recursive nature of linguistic generation because all they see is that merge reapplies its embedding operation in an iterative manner; what they fail to realise is that every stage of a derivation, much as every stage of a production system, is postulated to be recursively defined. This is exactly the same point I made earlier regarding the iterative conception of set in relation to Forster's confusion as to why it is called the iterative rather than the recursive conception. If so, then, a syntactic derivation is an iterative succession of recursive generations. As a matter of fact, Chomsky (2004) talks of merge being both a recursive system and iterable (p. 108), which certainly points to the distinction I am drawing in this paragraph, but perhaps not as explicitly as I am drawing it. Let us then end this section here and restate, reorganise, and rephrase the main points to bring home in the conclusion.30

5. What recursion could not be

As stated above, Chomsky introduced recursion into linguistics in order to account for discrete infinity, a central property of language and of what he calls the creative use of language; the latter, to wit, refers to the ability to understand and produce an unbounded number of sentences in appropriate circumstances and relatively free of external stimuli (see, for instance, Chomsky, 2006). This unbounded number of sentences need not be infinite in any precise sense; what the phrase is meant to exemplify, in my opinion, is simply the observation that the number of sentences we can produce and understand goes way beyond the finite memory we possess, and that therefore a generative procedure must be provided if we are to account for our linguistic knowledge. For this very purpose, Chomsky made use of the tools that were at hand at the moment, namely, those of recursive function theory, given that mathematicians had managed to formally explain how infinite systems could be modelled by finite means. Even though Chomsky initially adopted Post's production systems, his take on what recursion actually consists of appears to go beyond this particular formalism, taking this property to be central to what a computation in general is. As Soare (1996) has chronicled, this was certainly the spirit of the times, perhaps still extant in some quarters today; whether the beliefs and practices surrounding Soare's Recursion Convention are well-founded, and whether, derivatively, it is still proper to centre on recursion in our theories of language, I have not and will not discuss here. What I can say, what I have said, with most certainty is what recursion is supposed to stand for within the theory Chomsky has constructed, and with remarkable consistency, I should add.

The recursive property of language, such as Chomsky has understood it, is a central feature of both production systems and a set-operator like merge; it is the definition by induction that justifies every stage these generative procedures produce, independently of what operation exactly a given formalism effects. Production systems are strictly speaking string-substitution systems (structure being subsequently associated to the strings), while merge has been postulated to combine two syntactic objects into a compound, and then to carry on by merging syntactic objects into an ever more complex unit. There are, therefore, two different aspects, or perspectives if you wish, to what these generative procedures do; on the one hand, they recursively generate objects (strings or sets), while on the other, they effect a precise combinatorial operation (string-substitution in one case, embedding in the other). There is of course still the issue of how these procedures proceed, that is, of how the computational processes these procedures generate shape up: the structure of a derivation as opposed to the structure of an object. Given that both production systems and merge apply one single operation in succession (a recurrence), the derivation may be recursive or iterative, in the technical sense in which these two are understood within computer science. It is not clear, however, that there are any grounds for believing that a linguistic derivation proceeds recursively in this sense.

30 These last paragraphs supersede the material in Lobina and García-Albea (2009), where merge is correctly described as proceeding iteratively but the recursive generation it carries out at every stage is missed. Everett (2012) uses this source to argue that a syntactic derivation need not proceed recursively at all, but his point is undercut by the distinction between an aspect of what merge does (in this case, recursive generation) and how it proceeds.
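These two aspects of merge can be sketched very simply (a toy model of my own, not an implementation anyone has proposed): the combinatorial operation is set formation, while the recursive generation lies in the inductive definition of 'syntactic object' (lexical items are syntactic objects, and the merge of two syntactic objects is again a syntactic object), in the spirit of the recursive definition in Chomsky (1995b, p. 243).

```python
def merge(x, y):
    # the combinatorial operation merge effects: forming the set {X, Y},
    # thereby embedding X and Y within a bigger unit
    return frozenset({x, y})

# A bottom-up derivation: each stage is justified by the inductive
# definition of syntactic object, while the succession of stages
# itself proceeds iteratively.
stage1 = merge("the", "dog")   # {the, dog}
stage2 = merge("saw", stage1)  # {saw, {the, dog}}
```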

Be that as it may, the literature here surveyed has been found to be rather wanting. In particular, most of the scholars I have discussed have been unable or unwilling to track the history and application of recursion in linguistics, at least as exemplified in Chomsky's writings, and it is hard to understand this aspect of the literature; Chomsky has been rather clear, and in numerous publications to boot. As a result, most studies have focused on the nature of self-embedded sentences (or on the nature of self-embedding operations), and on whether this type of sentence is present in all languages of the world. As I have tried to show, those issues, whilst important and interesting in themselves, have little to do with recursion as this concept has been understood within mathematical logic, the connotation Chomsky has always employed, it seems. As such, there is a bit of a disconnect between what Chomsky is talking about when talking about recursion and what other linguists are talking about when talking about recursion. As a result, rather incompatible claims are being put forward in the literature; indeed, to defend the claim that what is universal in linguistics is the recursively-defined operations of a computational system (clearly Chomsky's take) is very different from the claim that all languages must exhibit self-embedded sentences if recursion is to be accepted as such a central concept (Everett is probably the most prominent example of this line of thought). In any case, by devoting so much space to the listing and exposition of the conflations one can find in the literature, I have been able to be more specific regarding the different aspects of generative procedures (what they are, what they do, how they proceed, their independence from the structures they generate, etc.), and that could be of some use to the field.

In order to end this paper, I should say that it is not my intention to offer any advice or suggestions; conceptual analysis and not semantic hygiene was my aim here. In particular, I have wanted to find out what the presumed centrality and importance of recursion actually amounts to, not to mention its hypothesised uniqueness and universality. So what does it all mean, then? To repeat and recap. Recursion is a property of the computational system underlying the faculty of language, a feature of the theory linguists construct and thereby part of any human language insofar as a generative procedure is to be prescribed for what has been called knowledge of language, the subject matter of linguistics. In addition and independently, this system generates recursive structures of a very general kind (bundles of specifier–head–complement(s) phrases), a geometry so basic and principal that no language could possibly do without it. The recursive property of the generative procedure appears to be a necessity, whereas the general recursive structure of linguistic expressions is a contingency; nevertheless, both properties are part of the language faculty in toto, and derivatively of every natural language, not the other way around.31

References

Abelson, H., Sussman, G.J., 1996. Structure and Interpretation of Computer Programs. The MIT Press, Cambridge, MA (with J. Sussman).
Arsenijević, B., Hinzen, W., 2010. Recursion as a human universal and as a primitive. Biolinguistics 4 (2–3), 165–173.
Bar-Hillel, Y., 1953. On recursive definitions in empirical science. In: Proceedings of the 11th International Congress of Philosophy in Brussels, vol. 5. North-Holland Publishing Co, Amsterdam, The Netherlands, pp. 160–165.
Benacerraf, P., 1965. What numbers could not be. Philos. Rev. 74 (1), 47–73.
Bickerton, D., 2009. Recursion: core of complexity or artifact of analysis? In: Givón, T., Shibatani, M. (Eds.), Syntactic Complexity: Diachrony, Acquisition, Neuro-cognition, Evolution. John Benjamins, pp. 531–544.
Boeckx, C., 2009. Language in Cognition. Wiley-Blackwell, Oxford, England.
Boolos, G., 1971. The iterative conception of set. J. Philos. 68 (8), 215–231.
Brainerd, W.S., Landweber, L.H., 1974. Theory of Computation. John Wiley and Sons, Inc, New York, New York.
Chomsky, N., 1955. Logical syntax and semantics: their linguistic relevance. Language 31 (1), 36–45.
Chomsky, N., 1956. Three models for the description of language. In: IRE Transactions on Information Theory IT-2, pp. 113–124.
Chomsky, N., 1957. Syntactic Structures. Mouton Publishers, The Hague, The Netherlands.
Chomsky, N., 1963. Formal properties of grammars. In: Luce, R.D., Bush, R.R., Galanter, E. (Eds.), Handbook of Mathematical Psychology. John Wiley and Sons, Inc, pp. 323–418.
Chomsky, N., 1965. Aspects of the Theory of Syntax. The MIT Press, Cambridge, MA.
Chomsky, N., 1975. The Logical Structure of Linguistic Theory. Plenum Press, New York, New York.
Chomsky, N., 1980. Rules and Representations. Columbia University Press, New York, New York.
Chomsky, N., 1981. Lectures on Government and Binding. De Gruyter Mouton, The Hague, The Netherlands.
Chomsky, N., 1986. Knowledge of Language. Praeger Press, New York, New York.
Chomsky, N., 1995a. Bare phrase structure. In: Webelhuth, G. (Ed.), Government and Binding Theory and the Minimalist Program. Blackwell Publishers, pp. 381–439.
Chomsky, N., 1995b. The Minimalist Program. The MIT Press, Cambridge, MA.
Chomsky, N., 2000. Minimalist inquiries: the framework. In: Martin, R., Michaels, D., Uriagereka, J. (Eds.), Step by Step. The MIT Press, pp. 89–156.
Chomsky, N., 2004. Beyond explanatory adequacy. In: Belleti, A. (Ed.), Structures and Beyond. Oxford University Press, pp. 104–131.
Chomsky, N., 2006. Language and Mind. Cambridge University Press, Cambridge, England.
Chomsky, N., 2008. On phases. In: Freidin, R., Otero, C.P., Zubizarreta, M.L. (Eds.), Foundational Issues in Linguistic Theory. The MIT Press, pp. 133–166.
Chomsky, N., 2012. The Science of Language: Interviews with James McGilvray. Cambridge University Press, Cambridge, England.
Chomsky, N., Miller, G.A., 1963. Introduction to the formal analysis of natural languages. In: Luce, R.D., Bush, R.R., Galanter, E. (Eds.), Handbook of Mathematical Psychology, vol. 2. John Wiley and Sons, Inc, pp. 269–322.
Church, A., 1936. An unsolvable problem of elementary number theory. In: Davis, M. (Ed.), The Undecidable. Dover Publications, Inc, pp. 88–107.

31 There are some orbiting issues that also deserve an airing, but that will have to await another paper. Some of these include whether a recursively-specified generative procedure must be postulated for other domains of human cognition (or indeed for the cognition of other species) and whetherother systems of the mind employ recursive structures that are isomorphic to the general recursive character of linguistic expressions (or indeed whetherthe representations of other species exhibit similar recursive structures to those of either human language or human general cognition).


Collins, J., 2008. Chomsky: A Guide for the Perplexed. Continuum International Publishing Group Ltd, London, UK.
Corballis, M., 2011. The Recursive Mind. Princeton University Press, Princeton, New Jersey.
Corcoran, J., 1972. Review of John Lyons' Noam Chomsky. Word 28 (3), 334–338.
Cutland, N., 1980. Computability: An Introduction to Recursive Function Theory. Cambridge University Press, Cambridge, England.
Davis, M.E., 1965. The Undecidable. Dover Publications, Inc, Mineola, New York.
Everett, D.L., 2005. Cultural constraints on grammar and cognition in Pirahã. Curr. Anthropol. 46 (4), 621–646.
Everett, D.L., 2008. Don't Sleep, There Are Snakes. Profile Books, London, England.
Everett, D.L., 2009. Pirahã culture and grammar: a reply to some criticisms. Language 85 (2), 405–442.
Everett, D.L., 2010. You Drink. You Drive. You Go to Jail. Where's Recursion? lingBuzz/001141.
Everett, D.L., 2012. What does Pirahã grammar have to teach us about human language and the mind? Wiley Interdiscip. Rev. Cogn. Sci. 3 (6), 555–563.
Fitch, W.T., 2010. Three meanings of recursion: key distinctions for biolinguistics. In: Larson, R., Déprez, V., Yamakido, H. (Eds.), The Evolution of Human Language. Cambridge University Press, pp. 73–90.
Forster, T., 2008. The iterative conception of set. Rev. Symb. Log. 1 (1), 97–110.
Frank, R., 2004. Restricting grammatical complexity. Cogn. Sci. 28, 669–697.
Gersting, J.L., 1982. Mathematical Structures for Computer Science. W. H. Freeman & Company, New York, New York.
Gödel, K., 1931. On formally undecidable propositions of the Principia Mathematica and related systems, I. In: Davis, M. (Ed.), The Undecidable. Dover Publications, Inc, pp. 4–38.
Hauser, M.D., 2009. Origin of the mind. Sci. Am. 301 (3), 44–51.
Hauser, M.D., Chomsky, N., Fitch, W.T., 2002. The faculty of language: what is it, who has it, and how did it evolve? Science 298, 1569–1579.
Hinzen, W., 2009. The successor function + LEX = human language? In: Grohmann, K.K. (Ed.), InterPhases. Oxford University Press, pp. 25–47.
Hornstein, N., 2009. A Theory of Syntax. Cambridge University Press, Cambridge, England.
Jackendoff, R., 2011. What is the human language faculty? Two views. Language 87 (3), 586–624.
Jackendoff, R., Pinker, S., 2005. The nature of the language faculty and its implications for evolution of language. Cognition 97, 211–225.
Kinsella, A.R., 2009. Language Evolution and Syntactic Theory. Cambridge University Press, Cambridge, England.
Kleene, S.C., 1938. On notation for ordinal numbers. J. Symb. Log. 3 (4), 150–155.
Kleene, S.C., 1952. Introduction to Metamathematics. North-Holland Publishing Co, Amsterdam.
Langendoen, T., 1975. The relation of competence to performance. Ann. N. Y. Acad. Sci. 263, 197–200.
Langendoen, T., 2010. Just how big are natural languages? In: van der Hulst, H. (Ed.), Recursion and Human Language. De Gruyter Mouton, pp. 139–147.
Langendoen, T., Postal, P.M., 1984. The Vastness of Natural Languages. Basil Blackwell Ltd, Oxford, England.
Liu, Y.A., Stoller, S.D., 1999. From recursion to iteration: what are the optimizations? SIGPLAN Not. 34 (11), 73–82.
Lobina, D.J., 2011a. Recursion and the competence/performance distinction in AGL tasks. Lang. Cogn. Process. 26 (10), 1563–1586.
Lobina, D.J., 2011b. "A running back" and forth: a review of recursion and human language. Biolinguistics 5 (1–2), 151–169.
Lobina, D.J., García-Albea, J.E., 2009. Recursion and cognitive science: data structures and mechanisms. In: Taatgen, N.A., van Rijn, H. (Eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society, pp. 1347–1352.
Luuk, E., Luuk, H., 2011. The redundancy of recursion and infinity for natural language. Cogn. Process. 12 (1), 1–11.
MacWhinney, B., 2009. The emergence of linguistic complexity. In: Givón, T., Shibatani, M. (Eds.), Syntactic Complexity. John Benjamins Publishing

Company, pp. 406–432.McCarthy, J., 1963. A basis for a mathematical theory of computation. In: Braffort, P., Hirshberg, D. (Eds.), Computer Programming and Formal Systems.

North-Holland Publishing Co, pp. 33–70.Miller, G.A., Chomsky, N., 1963. Finitary models of language users. In: Luce, R.D., Bush, R.R., Galanter, E. (Eds.), Handbook of Mathematical Psychology, vol. 2.

John Wiley and Sons, Inc, pp. 419–492.Moro, A., 2008. The Boundaries of Babel. The MIT Press, Cambridge, MA.Moschovakis, Y.N., 2001. What is an algorithm? In: Engquist, B., Schmid, W. (Eds.), Mathematics Unlimited: 2001 and Beyond. Springer, pp. 919–936.Moschovakis, Y.N., Paschalis, V., 2008. Elementary algorithms and their implementations. In: Cooper, S.B., Lowe, B., Sorbi, A. (Eds.), New Computational

Paradigms. Springer, pp. 81–118.Mukherji, N., 2010. The Primacy of Grammar. The MIT Press, Cambridge, Massachusetts.Nevins, A., Pesetsky, D., Rodrigues, C., 2007. Pirahã exceptionality: a reassessment. Language 85 (2), 355–404.Parker, A.R., 2006. Evolving the narrow language faculty: was recursion the pivotal step? In: Cangelosi, A., Smith, A.D.M., Smith, K. (Eds.), The Evolution of

Language: Proceedings of the Sixth International Conference on the Evolution of Language.Piattelli-Palmarini, M., 1980. Language and Learning: the Debate Between Jean Piaget and Noam Chomsky. Routledge and Kegan Paul, London, England.Piattelli-Palmarini, M., Uriagereka, J., Salaburu, P., 2009. Of Minds and Language: a Dialogue with Noam Chomsky in the Basque Country. Oxford University

Press, Oxford, England.Pinker, S., Jackendoff, R., 2005. The faculty of language: what’s special about it? Cognition 95, 201–236.Post, E., 1921. Introduction to a general theory of elementary propositions. Am. J. Math. 43 (3), 163–185.Post, E., 1943. Formal reductions of the general combinatorial decision problem. Am. J. Math. 65 (2), 197–215.Post, E., 1944. Recursively enumerable sets of positive integers and their decision problems. In: Davis, M. (Ed.), The Undecidable. Dover Publications, Inc, pp.

304–337.Pullum, G.K., 2007. The evolution of model-theoretic frameworks in linguistics. In: Rogers, J., Kepser, S. (Eds.), Model-theoretic Syntax at 10. Ireland, Dublin,

pp. 1–10.Pullum, G.K., 2011. On the mathematical foundations of syntactic structures. J. Log. Lang. Inf. 20 (3), 277–296.Pullum, G.K., Scholz, B.C., 2010. Recursion and the infinitude claim. In: van der Hulst, H. (Ed.), Recursion and Human Language. De Gruyter Mouton, The

Netherlands, pp. 113–138.Rice, G., 1965. Recursion and iteration. Commun. ACM 8 (2), 114–115.Roberts, E., 2006. Thinking Recursively with Java. John Wiley and Sons, Inc, Hoboken, NJ.Rodgers, P., Black, P.E., 2004. Recursive data structure. In: Pieterse, V., Black, P.E. (Eds.), Dictionary of Algorithms and Data Structures. Online at: http://www.

nist.gov/dads/HTML/recursivstrc.html.Sieg, W., 1997. Step by recursive step: church’s analysis of effective calculability. Bull. Symb. Log. 3 (2), 154–180.Soare, R., 1996. Computability and recursion. Bull. Symb. Log. 2 (3), 284–321.Soare, R., 2009. Turing oracles machines, online computing, and three displacements in computability theory. Ann. Pure Appl. Log. 160, 368–399.Tiede, H.-J., Stout, L.N., 2010. Recursion, infinity and modelling. In: van der Hulst, H. (Ed.), Recursion and Human Language. De Gruyter Mouton, pp. 147–158.Tomalin, M., 2006. Linguistics and the Formal Sciences. Cambridge University Press, Cambridge, England.Tomalin, M., 2007. Reconsidering recursion in syntactic theory. Lingua 117, 1784–1800.Tomalin, M., 2011. Syntactic structures and recursive devices: a legacy of imprecision. J. Log. Lang. Inf. 20 (3), 297–315.Turing, A.M., 1936. On computable numbers, with an application to the Entscheidungsproblem. In: Davis, M. (Ed.), The Undecidable. Dover Publications, Inc,

pp. 115–153.van der Hulst, H., 2010. Recursion and Human Language. De Gruyter Mouton, Berlin, Germany.Watumull, J., Hauser, M.D., Roberts, I.G., Hornstein, N., 2014. On recursion. Front. Psychol. 4, 1–7.Wirth, N., 1986. Algorithms and Data Structures. Prentice Hall Publishers, USA.