CS 321 Programming Languages and Compilers VI. Parsing


Parsing

Calculate grammatical structure of program, like diagramming sentences, where:

Tokens = “words”

Programs = “sentences”

For further information, read: Aho, Sethi, Ullman, “Compilers: Principles, Techniques, and Tools” (a.k.a. the “Dragon Book”).

Outline of coverage

• Context-free grammars
• Parsing
  – Tabular parsing methods
  – One pass
    » Top-down
    » Bottom-up
• Yacc

What parser does: extracts grammatical structure of program

Parse tree of  main() { cout << "hello, world\n"; }  (schematically):

  function-def
  ├── name: main
  ├── arguments
  └── stmt-list
      └── stmt
          └── expression
              ├── expression
              │   └── variable: cout
              ├── operator: <<
              └── expression
                  └── string: “hello, world\n”

Context-free languages

Grammatical structure is defined by a context-free grammar:

  statement → labeled-statement | expression-statement | compound-statement
  labeled-statement → ident : statement | case constant-expression : statement
  compound-statement → { declaration-list statement-list }

Symbols such as ident, case, :, {, } are terminals; statement, labeled-statement, etc. are non-terminals.

“Context-free” = only one non-terminal in the left part of each production.

Parse trees

Parse tree = tree labeled with grammar symbols, such that:

• If a node is labeled A and its children are labeled x₁…xₙ, then there is a production A → x₁…xₙ
• “Parse tree from A” = root labeled with A
• “Complete parse tree” = all leaves labeled with tokens

Parse trees and sentences

Frontier of a tree = labels on its leaves (in left-to-right order).

Frontier of a tree from S is a sentential form.
Frontier of a complete tree from S is a sentence.

Example tree, with frontier  a ; E :

  L
  ├── L
  │   └── E
  │       └── a
  ├── ;
  └── E

Example

G:  L → L ; E | E      E → a | b

Syntax trees from start symbol (L):

  L                (frontier: a)
  └── E
      └── a

  L                (frontier: a ; E)
  ├── L
  │   └── E
  │       └── a
  ├── ;
  └── E

  L                (frontier: a ; b ; b)
  ├── L
  │   ├── L
  │   │   └── E
  │   │       └── a
  │   ├── ;
  │   └── E
  │       └── b
  ├── ;
  └── E
      └── b

Sentential forms:  a      a ; E      a ; b ; b

Derivations

Alternate definition of sentence:

• Given α, β in V*, say α ⇒ β is a derivation step if α = α′ A α″ and β = α′ γ α″, where A → γ is a production
• β is a sentential form iff there exists a derivation (sequence of derivation steps) S ⇒ … ⇒ β (alternatively, we say that S ⇒* β)

The two definitions are equivalent, but note that there are many derivations corresponding to each parse tree.
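For example, with grammar G of the previous transparency, the sentence a;b has (among others) the two derivations

  L ⇒ L ; E ⇒ E ; E ⇒ a ; E ⇒ a ; b
  L ⇒ L ; E ⇒ L ; b ⇒ E ; b ⇒ a ; b

Both correspond to the same parse tree.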

Another example

H:  L → E ; L | E      E → a | b

Syntax trees from start symbol (L):

  L                (frontier: a)
  └── E
      └── a

  L                (frontier: a ; L)
  ├── E
  │   └── a
  ├── ;
  └── L

  L                (frontier: a ; b ; b)
  ├── E
  │   └── a
  ├── ;
  └── L
      ├── E
      │   └── b
      ├── ;
      └── L
          └── E
              └── b

Ambiguity

• For some purposes, it is important to know whether a sentence can have more than one parse tree.
• A grammar is ambiguous if there is a sentence with more than one parse tree.
• Example:  E → E+E | E*E | id

  The sentence id + id * id has two parse trees: one grouping it as id + (id * id), the other as (id + id) * id.

Ambiguity

• Ambiguity is a property of the grammar rather than the language: the same language may be generated by both ambiguous and unambiguous grammars.

Grammar Transformations

• Grammars can be transformed without affecting the language generated.
• Three transformations are discussed next:
  – Eliminating ambiguity
  – Eliminating left recursion (i.e., productions of the form A → Aα)
  – Left factoring

Grammar Transformation 1. Eliminating Ambiguity

• Sometimes an ambiguous grammar can be rewritten to eliminate ambiguity.
• For example, expressions involving additions and products can be written as follows:

  E → E+T | T
  T → T*id | id

• The language generated by this grammar is the same as that generated by the grammar on transparency 11. Both generate id(+id|*id)*.
• However, this grammar is not ambiguous.
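For example, id + id * id now has exactly one leftmost derivation,

  E ⇒ E + T ⇒ T + T ⇒ id + T ⇒ id + T * id ⇒ id + id * id

so the product is forced to appear nested inside the addition (compare the two trees of transparency 11).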

Grammar Transformation 1. Eliminating Ambiguity (Cont.)

• One advantage of this grammar is that it represents the precedence between operators: in the parse tree, products appear nested within additions.

  E
  ├── E
  │   └── T
  │       └── id
  ├── +
  └── T
      ├── T
      │   └── id
      ├── *
      └── id

Grammar Transformation 1. Eliminating Ambiguity (Cont.)

• The most famous example of ambiguity in a programming language is the dangling else.
• Consider

  S → if then S else S | if then S | ε

Grammar Transformation 1. Eliminating Ambiguity (Cont.)

• When there are two nested ifs and only one else, there are two parse trees: in one, the else is attached to the inner if (S ⇒ if then S, with the inner S ⇒ if then S else S); in the other, it is attached to the outer if (S ⇒ if then S else S, with the inner S ⇒ if then S).

Grammar Transformation 1. Eliminating Ambiguity (Cont.)

• In most languages (including C++ and Java), each else is assumed to belong to the nearest if that is not already matched by an else. This association is expressed in the following (unambiguous) grammar:

  S → Matched | Unmatched
  Matched → if then Matched else Matched | ε
  Unmatched → if then S | if then Matched else Unmatched

Grammar Transformation 1. Eliminating Ambiguity (Cont.)

• Ambiguity is a property of the grammar.
• It is undecidable whether a context-free grammar is ambiguous.
• The proof is by reduction from Post’s correspondence problem.
• Although there is no general algorithm, it is possible to isolate certain constructs in productions which lead to ambiguous grammars.

Grammar Transformation 1. Eliminating Ambiguity (Cont.)

• For example, a grammar containing the production

  A → AA | α

  would be ambiguous, because the substring ααα has two parses: one grouping the first two α’s under the left A, the other grouping the last two α’s under the right A.

• This ambiguity disappears if we use the productions  A → AB | B  and  B → α,  or the productions  A → BA | B  and  B → α.

Grammar Transformation 1. Eliminating Ambiguity (Cont.)

• Three other examples of ambiguous productions are:
  – A → AαA
  – A → Aα | αA, and
  – A → αA | AαA

• A language generated by an ambiguous context-free grammar is inherently ambiguous if it has no unambiguous context-free grammar. (This can be proven formally.)
  – An example of such a language is L = {aⁱbʲcᵐ | i = j or j = m}, which can be generated by the grammar:

    S → AB | DC
    A → aA | ε      C → cC | ε
    B → bBc | ε     D → aDb | ε

Grammar Transformations 2. Elimination of Left Recursion

• A grammar is left recursive if it has a nonterminal A and a derivation A ⇒⁺ Aα for some string α. Top-down parsing methods (to be discussed shortly) cannot handle left-recursive grammars, so a transformation to eliminate left recursion is needed.
• Immediate left recursion (productions of the form A → Aα) can be easily eliminated.
• We group the A-productions as

  A → Aα₁ | Aα₂ | … | Aαₘ | β₁ | β₂ | … | βₙ

  where no βᵢ begins with A. Then we replace the A-productions by

  A → β₁A′ | β₂A′ | … | βₙA′
  A′ → α₁A′ | α₂A′ | … | αₘA′ | ε
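A minimal Python sketch of this rule (the dictionary representation of the grammar, with right-hand sides as tuples of symbols and ε written as the empty tuple, is just an assumption made here for illustration):

  def eliminate_immediate_left_recursion(grammar, A):
      """Replace A -> A a1 | ... | A am | b1 | ... | bn by
         A  -> b1 A' | ... | bn A'
         A' -> a1 A' | ... | am A' | ε   (ε = empty tuple)."""
      alphas = [rhs[1:] for rhs in grammar[A] if rhs and rhs[0] == A]
      betas = [rhs for rhs in grammar[A] if not rhs or rhs[0] != A]
      if not alphas:                       # no immediate left recursion: nothing to do
          return grammar
      A1 = A + "'"                         # fresh nonterminal A'
      grammar[A] = [beta + (A1,) for beta in betas]
      grammar[A1] = [alpha + (A1,) for alpha in alphas] + [()]
      return grammar

  # E -> E + T | T  becomes  E -> T E',  E' -> + T E' | ε
  g = {"E": [("E", "+", "T"), ("T",)], "T": [("T", "*", "id"), ("id",)]}
  print(eliminate_immediate_left_recursion(g, "E"))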

Grammar Transformations 2. Elimination of Left Recursion (Cont.)

• The previous transformation, however, does not eliminate left recursion involving two or more steps. For example, consider the grammar

  S → Aa | b
  A → Ac | Sd | ε

S is left recursive because S ⇒ Aa ⇒ Sda, but it is not immediately left recursive.

Grammar Transformations 2. Elimination of Left Recursion (Cont.)

Algorithm. Eliminate left recursion

  Arrange the nonterminals in some order A₁, A₂, …, Aₙ
  for i = 1 to n {
    for j = 1 to i-1 {
      replace each production of the form Aᵢ → Aⱼγ by the productions
        Aᵢ → δ₁γ | δ₂γ | … | δₖγ,
      where Aⱼ → δ₁ | δ₂ | … | δₖ are all the current Aⱼ-productions
    }
    eliminate the immediate left recursion among the Aᵢ-productions
  }
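As a worked example, applying the algorithm to the grammar of the previous transparency with the order S, A:

  • i = 1 (S): the inner loop does nothing, and S → Aa | b has no immediate left recursion.
  • i = 2 (A), j = 1: substituting the S-productions into A → Sd gives A → Ac | Aad | bd | ε.
  • Eliminating the immediate left recursion among the A-productions then yields

    S → Aa | b
    A → bdA′ | A′
    A′ → cA′ | adA′ | ε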

Grammar Transformations 2. Elimination of Left Recursion (Cont.)

• To show that the previous algorithm actually works, all we need to notice is that iteration i only changes productions with Aᵢ on the left-hand side, and that afterwards m > i in all productions of the form Aᵢ → Aₘγ.
• This can be easily shown by induction.
  – It is clearly true for i = 1.
  – If it is true for all i < k, then when the outer loop is executed for i = k, the inner loop will remove all productions Aᵢ → Aₘγ with m < i.
  – Finally, with the elimination of immediate (self) left recursion, m in the Aᵢ → Aₘγ productions is forced to be > i.
• So, at the end of the algorithm, all productions of the form Aᵢ → Aₘγ have m > i, and therefore left recursion is no longer possible.

Grammar Transformations 3. Left Factoring

• Left factoring helps transform a grammar for predictive parsing.
• For example, if we have the two productions

  S → if then S else S
  S → if then S

  then on seeing the input token if, we cannot immediately tell which production to choose to expand S.
• In general, if we have A → αβ₁ | αβ₂ and the input begins with α, we do not know (without looking further) which production to use to expand A.

Grammar Transformations 3. Left Factoring (Cont.)

• However, we may defer the decision by expanding A to αA′.
• Then, after seeing the input derived from α, we may expand A′ to β₁ or to β₂. That is, left-factored, the original productions become

  A → αA′
  A′ → β₁ | β₂
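For example, left-factoring the two if-productions of the previous transparency gives

  S → if then S S′
  S′ → else S | ε

so the choice between the else and no-else alternatives is deferred until the parser has seen what follows the first statement.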

Non-Context-Free Language Constructs

• Examples of non-context-free languages are:
  – L1 = {wcw | w is of the form (a|b)*}
  – L2 = {aⁿbᵐcⁿdᵐ | n ≥ 1 and m ≥ 1}
  – L3 = {aⁿbⁿcⁿ | n ≥ 0}
• Languages similar to these that are context free:
  – L′1 = {wcwᴿ | w is of the form (a|b)*}  (wᴿ stands for w reversed)
    This language is generated by the grammar
    » S → aSa | bSb | c
  – L′2 = {aⁿbᵐcᵐdⁿ | n ≥ 1 and m ≥ 1}
    This language is generated by the grammar
    » S → aSd | aAd
    » A → bAc | bc

Non-Context-Free Language Constructs (Cont.)

  – L″2 = {aⁿbⁿcᵐdᵐ | n ≥ 1 and m ≥ 1}
    This language is generated by the grammar
    » S → AB
    » A → aAb | ab
    » B → cBd | cd
  – L′3 = {aⁿbⁿ | n ≥ 1}
    This language is generated by the grammar
    » S → aSb | ab
    This language is not definable by any regular expression.

Non-Context-Free Language Constructs (Cont.)

• Suppose we could construct a DFSM D accepting L′3.
• D must have a finite number of states, say k.
• Consider the sequence of states s₀, s₁, s₂, …, sₖ entered by D having read ε, a, aa, …, aᵏ.
• Since D only has k states, two of the states in the sequence have to be equal, say sᵢ = sⱼ (i ≠ j).
• From sᵢ, a sequence of i b’s leads to an accepting (final) state. Therefore, the same sequence of i b’s will also lead to an accepting state from sⱼ. Therefore D would accept aʲbⁱ, which means that the language accepted by D is not identical to L′3. A contradiction.

Parsing

The parsing problem is: given a string of tokens w, find a parse tree whose frontier is w. (Equivalently, find a derivation of w.)

A parser for a grammar G reads a list of tokens and finds a parse tree if they form a sentence (or reports an error otherwise).

Two classes of algorithms for parsing:
  – Top-down
  – Bottom-up

Parser generators

A parser generator is a program that reads a grammar and produces a parser.

The best known parser generator is yacc; it produces bottom-up parsers.

Most parser generators - including yacc - do not work for every CFG; they accept a restricted class of CFGs that can be parsed efficiently using the method employed by that parser generator.

Top-down parsing

• Starting from a parse tree containing just S, build the tree down toward the input. Expand the left-most non-terminal.
• Algorithm: (next slide)

Top-down parsing (cont.)

• Let input = a₁a₂…aₙ

  current sentential form (csf) = S

  loop {
    suppose csf = t₁…tₖAα
    if t₁…tₖ ≠ a₁…aₖ, it’s an error
    based on aₖ₊₁…, choose a production A → β
    csf becomes t₁…tₖβα
  }

Top-down parsing example

Grammar H:  L → E ; L | E      E → a | b
Input: a;b

  Parse tree (production applied)    Sentential form    Input
  L (start symbol)                   L                  a;b
  L → E ; L                          E ; L              a;b
  E → a                              a ; L              a;b

Top-down parsing example (cont.)

  Parse tree (production applied)    Sentential form    Input
  L → E  (for the remaining L)       a ; E              a;b
  E → b                              a ; b              a;b

LL(1) parsing

An efficient form of top-down parsing.

Use only the first symbol of the remaining input (aₖ₊₁) to choose the next production. That is, employ a function M: N × Σ → P in the “choose production” step of the algorithm.

When this works, the grammar is (usually) called LL(1). (A more precise definition follows.)

LL(1) examples

• Example 1:  H:  L → E ; L | E
                  E → a | b

  Given input a;b, the next symbol is a.
  Which L-production to use? Can’t tell.
  H is not LL(1).

LL(1) examples

• Example 2:

  Exp → Term Exp′
  Exp′ → $ | + Exp
  Term → id

  (Use $ for the “end-of-input” symbol.)

  The grammar is LL(1): Exp and Term have only one production; Exp′ has two productions but only one is applicable at any time.
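Because the grammar is LL(1), a recursive-descent parser can pick each production by looking at a single token. A minimal Python sketch (the token-list representation and the helper names are assumptions made here for illustration):

  def parse(tokens):
      """Recursive-descent parser for  Exp -> Term Exp',  Exp' -> $ | + Exp,  Term -> id."""
      pos = 0

      def peek():
          return tokens[pos]

      def match(expected):
          nonlocal pos
          if tokens[pos] != expected:
              raise SyntaxError(f"expected {expected!r}, found {tokens[pos]!r}")
          pos += 1

      def exp():                       # Exp -> Term Exp'
          term()
          exp_prime()

      def exp_prime():                 # Exp' -> $ | + Exp  (choose by one-token lookahead)
          if peek() == "$":
              match("$")
          elif peek() == "+":
              match("+")
              exp()
          else:
              raise SyntaxError(f"expected '$' or '+', found {peek()!r}")

      def term():                      # Term -> id
          match("id")

      exp()

  parse(["id", "+", "id", "$"])        # accepted; a bad input raises SyntaxError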

Nonrecursive predictive parsing

• It is possible to build a nonrecursive predictive parser by maintaining a stack explicitly, rather than implicitly via recursive calls.
• The key problem during predictive parsing is that of determining the production to be applied for a non-terminal.

Nonrecursive predictive parsing

Algorithm. Nonrecursive predictive parsing

  Set ip to point to the first symbol of w$.
  repeat
    Let X be the symbol on top of the stack and a the symbol pointed to by ip
    if X is a terminal or $ then
      if X == a then
        pop X from the stack and advance ip
      else error()
    else  // X is a nonterminal
      if M[X,a] == X → Y₁Y₂…Yₖ then
        pop X from the stack
        push Yₖ, Yₖ₋₁, …, Y₁ onto the stack, with Y₁ on top
        (push nothing if Y₁Y₂…Yₖ is ε)
        output the production X → Y₁Y₂…Yₖ
      else error()
  until X == $
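A Python sketch of this loop, assuming the parsing table M is a dict mapping (nonterminal, token) pairs to right-hand-side tuples; the example grammar is the LL(1) expression grammar from Example 2 above:

  def predictive_parse(tokens, table, start, nonterminals):
      """Table-driven LL(1) parsing; returns the productions used, in order."""
      stack = ["$", start]                 # bottom marker, then the start symbol
      ip = 0
      output = []
      while True:
          X, a = stack[-1], tokens[ip]
          if X not in nonterminals:        # X is a terminal or $
              if X == a:
                  stack.pop()
                  ip += 1
              else:
                  raise SyntaxError(f"expected {X!r}, found {a!r}")
          else:
              rhs = table.get((X, a))
              if rhs is None:
                  raise SyntaxError(f"no production for {X} on {a!r}")
              stack.pop()
              stack.extend(reversed(rhs))  # push Yk ... Y1 so that Y1 ends up on top
              output.append((X, rhs))
          if X == "$":
              return output

  # Exp -> Term Exp',  Exp' -> $ | + Exp,  Term -> id
  M = {("Exp", "id"): ("Term", "Exp'"),
       ("Exp'", "$"): ("$",),
       ("Exp'", "+"): ("+", "Exp"),
       ("Term", "id"): ("id",)}
  print(predictive_parse(["id", "+", "id", "$"], M, "Exp", {"Exp", "Exp'", "Term"}))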

LL(1) grammars

• No left recursion.
  A → Aα : if this production is chosen, the parse makes no progress.
• No common prefixes.
  A → αβ | αγ
  Can fix by “left factoring”:
  A → αA′
  A′ → β | γ

LL(1) grammars (cont.)

• No ambiguity.
  The precise definition requires that the production to choose be unique (the “choose” function M is very hard to calculate otherwise).

Top-down Parsing

[Diagram: the start symbol L is the root of the parse tree; as the input tokens <t₀, t₁, …, tᵢ, …> are consumed from left to right, the parse tree is “grown” downwards.]

Checking LL(1)-ness

• For any sequence of grammar symbols α, define the set FIRST(α) to be those tokens a such that α ⇒ … ⇒ aβ for some β.
  (Notation: write α ⇒* aβ.)

Checking LL(1)-ness

• Define: Grammar G = (N, Σ, P, S) is LL(1) if whenever there are two left-most derivations (in which the leftmost non-terminal is always expanded first)

  S ⇒* wAγ ⇒ wαγ ⇒* wx
  S ⇒* wAγ ⇒ wβγ ⇒* wy

  such that FIRST(x) = FIRST(y), it follows that α = β.

  In other words, given
  1. a string wAγ in V* and
  2. the first terminal symbol to be derived from A, say t,
  there is at most one production that can be applied to A to yield a derivation of any terminal string beginning with wt.

• FIRST sets can often be calculated by inspection.

FIRST Sets

  Exp → Term Exp′
  Exp′ → $ | + Exp
  Term → id

  (Use $ for the “end-of-input” symbol.)

• FIRST(Term Exp′) = {id}
• FIRST($) = {$}, FIRST(+ Exp) = {+}, which implies FIRST($) ∩ FIRST(+ Exp) = {}
• FIRST(id) = {id}

⇒ the grammar is LL(1)

FIRST Sets

  H:  L → E ; L | E      E → a | b

  FIRST(E ; L) = {a,b} = FIRST(E)
  FIRST(E ; L) ∩ FIRST(E) ≠ {}  ⇒  H is not LL(1).

How to compute FIRST Sets of Vocabulary Symbols

Algorithm. Compute FIRST(X) for all grammar symbols X

  forall X ∈ V do FIRST(X) = {}
  forall X ∈ Σ (X is a terminal) do FIRST(X) = {X}
  forall productions X → ε do FIRST(X) = FIRST(X) ∪ {ε}
  repeat
    forall productions X → Y₁Y₂…Yₖ do
      forall i ∈ [1,k] do
        FIRST(X) = FIRST(X) ∪ (FIRST(Yᵢ) - {ε})
        if ε ∉ FIRST(Yᵢ) then continue outer loop
      FIRST(X) = FIRST(X) ∪ {ε}
  until no more terminals or ε are added to any FIRST set
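A Python sketch of this fixed-point computation (the grammar representation, a dict from nonterminal to a list of right-hand-side tuples with ε-productions written as empty tuples, is an assumption made here for illustration):

  EPS = "ε"

  def first_sets(grammar, terminals):
      """Compute FIRST(X) for every grammar symbol X by fixed-point iteration."""
      first = {t: {t} for t in terminals}
      first.update({n: set() for n in grammar})
      changed = True
      while changed:
          changed = False
          for X, productions in grammar.items():
              for rhs in productions:
                  before = len(first[X])
                  for Y in rhs:
                      first[X] |= first[Y] - {EPS}
                      if EPS not in first[Y]:
                          break
                  else:                    # every Yi derives ε (or rhs is empty), so add ε
                      first[X].add(EPS)
                  if len(first[X]) != before:
                      changed = True
      return first

  # H:  L -> E ; L | E      E -> a | b
  H = {"L": [("E", ";", "L"), ("E",)], "E": [("a",), ("b",)]}
  print(first_sets(H, {"a", "b", ";"}))    # FIRST(L) = FIRST(E) = {a, b}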

How to compute FIRST Sets of Strings of Symbols

• FIRST(X₁X₂…Xₙ) is the union of FIRST(X₁) and all FIRST(Xᵢ) such that ε ∈ FIRST(Xₖ) for k = 1, 2, …, i-1.
• FIRST(X₁X₂…Xₙ) contains ε iff ε ∈ FIRST(Xₖ) for k = 1, 2, …, n.
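Using the FIRST sets computed by the first_sets sketch above, FIRST of a string of symbols follows the same rule:

  def first_of_string(symbols, first):
      """FIRST(X1 X2 ... Xn), given precomputed FIRST sets for single symbols."""
      result = set()
      for X in symbols:
          result |= first[X] - {EPS}
          if EPS not in first[X]:          # Xi cannot derive ε: later symbols don't matter
              return result
      result.add(EPS)                      # every Xi derives ε (or the string is empty)
      return result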

FIRST Sets do not Suffice

• Given the productions

  A → Tx
  A → Ty
  T → w
  T → ε

• T → w should be applied when the next input token is w.
• T → ε should be applied whenever the next terminal (the one pointed to by ip) is either x or y.

FOLLOW Sets

• For any nonterminal X, define the set FOLLOW(X) to be those tokens a such that S ⇒* αXaβ for some α and β.

How to compute the FOLLOW Set

Algorithm. Compute FOLLOW(X) for all nonterminals X

  FOLLOW(S) = {$}
  forall productions A → αBβ do FOLLOW(B) = FOLLOW(B) ∪ (FIRST(β) - {ε})
  repeat
    forall productions A → αB, or A → αBβ with ε ∈ FIRST(β), do
      FOLLOW(B) = FOLLOW(B) ∪ FOLLOW(A)
  until all FOLLOW sets remain the same
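A Python sketch of this computation, reusing first_sets and first_of_string from the previous sketches (start names the start symbol):

  def follow_sets(grammar, first, start):
      """Compute FOLLOW(B) for every nonterminal B by fixed-point iteration."""
      follow = {n: set() for n in grammar}
      follow[start].add("$")
      changed = True
      while changed:
          changed = False
          for A, productions in grammar.items():
              for rhs in productions:
                  for i, B in enumerate(rhs):
                      if B not in grammar:           # B is a terminal: skip
                          continue
                      tail = first_of_string(rhs[i + 1:], first)
                      before = len(follow[B])
                      follow[B] |= tail - {EPS}
                      if EPS in tail:                # β can derive ε (or is empty)
                          follow[B] |= follow[A]
                      if len(follow[B]) != before:
                          changed = True
      return follow

  H = {"L": [("E", ";", "L"), ("E",)], "E": [("a",), ("b",)]}
  f = first_sets(H, {"a", "b", ";"})
  print(follow_sets(H, f, "L"))    # FOLLOW(L) = {$},  FOLLOW(E) = {;, $}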

Construction of a predictive parsing table

Algorithm. Construction of a predictive parsing table

  M[:,:] = {}
  forall productions A → α do
    forall a ∈ FIRST(α) do
      M[A,a] = M[A,a] ∪ {A → α}
    if ε ∈ FIRST(α) then
      forall b ∈ FOLLOW(A) do
        M[A,b] = M[A,b] ∪ {A → α}
  Make all empty entries of M be error
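A Python sketch of the table construction, again reusing the earlier FIRST/FOLLOW sketches; if any cell ends up holding more than one production, the grammar is not LL(1):

  def parsing_table(grammar, first, follow):
      """Build M[A, a]; a cell with more than one production means the grammar is not LL(1)."""
      M = {}
      for A, productions in grammar.items():
          for rhs in productions:
              f = first_of_string(rhs, first)
              lookaheads = (f - {EPS}) | (follow[A] if EPS in f else set())
              for a in lookaheads:
                  M.setdefault((A, a), []).append(rhs)
      return M

  # Exp -> Term Exp',  Exp' -> $ | + Exp,  Term -> id
  G = {"Exp": [("Term", "Exp'")],
       "Exp'": [("$",), ("+", "Exp")],
       "Term": [("id",)]}
  f = first_sets(G, {"id", "+", "$"})
  M = parsing_table(G, f, follow_sets(G, f, "Exp"))
  print(all(len(entries) == 1 for entries in M.values()))    # True: the grammar is LL(1)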

Another Definition of LL(1)

Define: Grammar G is LL(1) if for every A ∈ N with productions A → α₁ | … | αₙ,

  FIRST(αᵢ FOLLOW(A)) ∩ FIRST(αⱼ FOLLOW(A)) = ∅   for all i ≠ j.
