
Ambiguity Management in Deep Grammar Engineering


Page 1: Ambiguity Management in  Deep Grammar Engineering

Ambiguity Management in Deep Grammar Engineering

Tracy Holloway King

Page 2: Ambiguity Management in  Deep Grammar Engineering

Ambiguity: bug or feature?

Bug in computer programming languages
Feature in natural language
– People good at resolving ambiguity in context
– Ambiguity consequently often unperceived
  “Readjust paper holding clip”
  even though thousand-fold ambiguities are common
– Ambiguity promotes conciseness

Computers can’t resolve ambiguity like humans

If we are going to build large-scale, linguistically sophisticated grammars, we need ways to handle ambiguity

Page 3: Ambiguity Management in  Deep Grammar Engineering

Talk Outline

Sources of ambiguity
Grammar engineering approaches
– Shallow markup
– (Dis)preference marks
Stochastic disambiguation
Efficiency in ambiguity management

Page 4: Ambiguity Management in  Deep Grammar Engineering

Sources of Ambiguity

Phonetic:
– “I scream” or “ice cream”
Tokenization:
– “I like Jan.” --- |Jan|. or |Jan.|. (abbrev January)
Morphological:
– “walks” --- plural noun or 3sg verb
– “untieable knot” --- un(tieable) or (untie)able
Lexical:
– “bank” --- river bank or financial institution
Syntactic:
– “The turkeys are ready to eat.” --- fattened or hungry
Semantic:
– “Two boys ate fifteen pizzas.” --- 15 each or 15 total
Pragmatic:
– “Sue won. Ed gave her a good luck charm.” --- cause or result

Page 5: Ambiguity Management in  Deep Grammar Engineering

PP Attachment
A classic example of syntactic ambiguity

PP adjuncts can attach to VPs and NPs
Strings of PPs in the VP are ambiguous
– I see the girl with the telescope.
  I see [the girl with the telescope].
  I see [the girl] [with the telescope].
Ambiguities proliferate exponentially
– I see the girl with the telescope in the park
  I see [the girl with [the telescope in the park]]
  I see [the [girl with the telescope] in the park]
  I see the girl [with the [telescope in the park]]
  I see the girl [with the telescope] [in the park]
  I see [the girl with the telescope] [in the park]
– The syntax has no way to determine the attachment, even if humans can.
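As a back-of-the-envelope illustration of this proliferation (my addition, not part of the original talk): the number of ways to attach a run of trailing PPs follows the Catalan numbers, which matches the 2 and 5 readings in the examples above.

# Hedged sketch: attachment counts for n stacked PPs follow the Catalan numbers.
from math import comb

def catalan(n):
    """n-th Catalan number: C(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

def pp_readings(num_pps):
    """Attachment ambiguities for num_pps trailing PPs after 'V NP'."""
    return catalan(num_pps + 1)

for n in range(1, 7):
    print(n, "PPs ->", pp_readings(n), "readings")
# 1 PP -> 2, 2 PPs -> 5, 3 PPs -> 14, 4 PPs -> 42, 5 PPs -> 132, 6 PPs -> 429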

Page 6: Ambiguity Management in  Deep Grammar Engineering

Coverage entails ambiguity

I fell in the park.

+ I know the girl in the park.

I see the girl in the park.

Page 7: Ambiguity Management in  Deep Grammar Engineering

Ambiguity can be explosive
If alternatives multiply within or across components…

[Pipeline figure: Tokenize → Morphology → Syntax → Semantics → Discourse]

Page 8: Ambiguity Management in  Deep Grammar Engineering

Ambiguity figures

Deep grammars are massively ambiguous
Example: 700 sentences from section 23 of the WSJ
– average # of words: 19.6
– average # of optimal parses: 684
  » for 1-10 word sentences: 3.8
  » for 11-20 word sentences: 25.2
  » for 50-60 word sentences: 12,888

Page 9: Ambiguity Management in  Deep Grammar Engineering

Managing Ambiguity

Grammar engineering approaches
– Trim early with shallow markup
– (Dis)preference marks on rules

Choose most probable parse for applications that need a single input

Use packing to parse and manipulate the ambiguities efficiently

Page 10: Ambiguity Management in  Deep Grammar Engineering

Talk Outline

Sources of ambiguity
Grammar engineering approaches
– Shallow markup
– (Dis)preference marks
Stochastic disambiguation
Efficiency in ambiguity management

Page 11: Ambiguity Management in  Deep Grammar Engineering

Shallow markup

Part-of-speech marking as a filter
– I saw her duck/VB.
– accuracy of tagger (very good for English)
– can use partial tagging (verbs and nouns)
Named entities
– <company>Goldman, Sachs & Co.</company> bought IBM.
– good for proper names and times
– hard to parse internal structure
Fall-back technique if parsing the marked-up input fails
– slows parsing
– accuracy vs. speed

Page 12: Ambiguity Management in  Deep Grammar Engineering

Example shallow markup: Named entities

Allow tokenizer to accept marked up input:

parse {<person>Mr. Thejskt Thejs</person> arrived.}

tokenized string:

Mr. Thejskt Thejs TB +NEperson Mr(TB). TB Thejskt TB Thejs TB arrived TB . TB

Add lexical entries and rules for NE tags
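To make the markup-to-token step concrete, here is a toy sketch (mine, not XLE's actual tokenizer): it accepts <person>…</person> markup and offers the named entity both as a single multiword +NEperson token and as ordinary word-by-word tokens, so the grammar can still parse either way.

import re

def tokenize(text):
    """Split a plain string into word and punctuation tokens."""
    return re.findall(r"\w+|[^\w\s]", text)

def tokenizations(marked_up):
    """Return alternative token sequences for NE-marked input."""
    m = re.search(r"<person>(.*?)</person>", marked_up)
    if not m:
        return [tokenize(marked_up)]
    before, ne, after = marked_up[:m.start()], m.group(1), marked_up[m.end():]
    as_ne_token = tokenize(before) + [ne + " +NEperson"] + tokenize(after)
    as_plain_words = tokenize(before) + tokenize(ne) + tokenize(after)
    return [as_ne_token, as_plain_words]

for alternative in tokenizations("<person>Mr. Thejskt Thejs</person> arrived."):
    print(" TB ".join(alternative))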

Page 13: Ambiguity Management in  Deep Grammar Engineering

Resulting C-structure

Page 14: Ambiguity Management in  Deep Grammar Engineering

Resulting F-structure

Page 15: Ambiguity Management in  Deep Grammar Engineering

Results for shallow markup
(paired values are Full/All)

             % Full parses   Optimal sol'ns   Best F-score   Time %
Unmarked     76              482/1753         82/79          65/100
Named ent    78              263/1477         86/84          60/91
POS tag      62              248/1916         76/72          40/48

Kaplan and King 2003

Page 16: Ambiguity Management in  Deep Grammar Engineering

(Dis)preference marks (OT marks)

Want to (dis)prefer certain constructions
– prefer: use when possible
– disprefer: do not use unless there is no other analysis
Implementation
– Put marks in rules and lexical entries
– Rank those marks
  » ranking can be different for different grammars/corpora
– Use the most preferred parse(s)
  » can be used as a two-pass system for robust parsing
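As a rough sketch of how ranked marks could pick out the most preferred analyses (a simplification of what XLE actually does; the mark names and the ranking below are invented for illustration):

# Compare parses by their counts of dispreference marks, most serious mark first;
# keep only the parses whose mark profile is (lexicographically) best.

MARK_RANKING = ["BadVAgr", "MissingCopularVerb", "InfAdjunct"]  # most to least serious

def profile(marks, ranking=MARK_RANKING):
    """Counts of each ranked mark in one parse; lower is better."""
    return tuple(marks.count(m) for m in ranking)

def most_preferred(parses):
    """Filter to the analyses whose mark profile is minimal."""
    best = min(profile(p["marks"]) for p in parses)
    return [p for p in parses if profile(p["marks"]) == best]

# Two analyses of the same string: the one that only incurs the
# lower-ranked InfAdjunct mark beats the one flagged BadVAgr.
parses = [
    {"analysis": "subcat infinitive, bad agreement", "marks": ["BadVAgr"]},
    {"analysis": "adjunct infinitive, good agreement", "marks": ["InfAdjunct"]},
]
print(most_preferred(parses))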

Page 17: Ambiguity Management in  Deep Grammar Engineering

Ungrammatical input

Real-world text contains ungrammatical input
– Deep grammars tend to cover only grammatical strings
Common errors can be coded in the rules
– may want to know that an error occurred
  (e.g., provide feedback in CALL grammars)
Disprefer parses of ungrammatical structures
– tools for the grammar writer to rank rules
– two+ pass system:
  1. standard rules
  2. rules for known ungrammatical constructions
  3. default fall-back rules

Page 18: Ambiguity Management in  Deep Grammar Engineering

Sample ungrammatical structures

Mismatched subject-verb agreement:
  Verb3Sg = { SUBJ PERS = 3
              SUBJ NUM = sg
            | BadVAgr }

Missing copula:
  VPcop ==> { Vcop: ^=!
            | e: (^ PRED)='NullBe<(^ SUBJ)(^ XCOMP)>'
                 MissingCopularVerb }
            { NP: (^ XCOMP)=!
            | AP: (^ XCOMP)=!
            | … }

Page 19: Ambiguity Management in  Deep Grammar Engineering

Dispreferred grammatical structures

Prefer subcategorized infinitives to adverbials
– I want it.   I finished up (in order) to leave.
– I want it to leave.

  VP --> V
         (NP: (^ OBJ)=!)
         (VPinf: { (^ XCOMP)=! +InfSubcat
                 | ! $ (^ ADJUNCT) InfAdjunct }).

Post-copular gerunds
– He is a boy.   (His) going is difficult.
– He is going.

Page 20: Ambiguity Management in  Deep Grammar Engineering

OT Mark summary

Use (dis)preference marks to (dis)prefer constructions or words

Allows inclusion of marginal/ungrammatical constructions

Issues:
– Only works for ambiguities with known preferences (not PP attachment)
– Hard to determine ranking for many marks
– Two-pass parsing can be slow

Page 21: Ambiguity Management in  Deep Grammar Engineering

Talk Outline

Sources of ambiguity
Grammar engineering approaches
– Shallow markup
– (Dis)preference marks
Stochastic disambiguation
Efficiency in ambiguity management

Page 22: Ambiguity Management in  Deep Grammar Engineering

Packing & Pruning in XLE

XLE produces (too) many candidates
– All valid (with respect to the grammar and OT marks)
– Not all equally likely
– Some applications require a single best parse, or at most just a handful (n best)
The grammar writer can't specify the correct choices
– Many implicit properties of words and structures with unclear significance

Page 23: Ambiguity Management in  Deep Grammar Engineering

Pruning in XLE

Appeal to a probability model to choose the best parse
Assume: previous experience is a good guide for future decisions
Collect a corpus of training sentences; build a probability model that optimizes for previous good results
– partially labelled training data is ok:
  [NP-SBJ They] see [NP-OBJ the girl with the telescope]
Apply the model to choose the best analysis of new sentences
– efficient (XLE English grammar: 5% of parse time)

Page 24: Ambiguity Management in  Deep Grammar Engineering

Exponential models are appropriate
(aka Maximum Entropy or Log-linear models)

Assign probabilities to representations, not to choices in a derivation
No independence assumption
Arithmetic combined with human insight
– Human:
  » Define properties of representations that may be relevant
  » Based on any computable configuration of features, trees
– Arithmetic:
  » Train to figure out the weight of each property

Page 25: Ambiguity Management in  Deep Grammar Engineering

Properties employed in WSJ Experiment

~800 property-functions:
– c-structure nodes and subtrees
– recursively embedded phrases
– f-structure attributes (grammatical functions)
– atomic attribute-value pairs
– left/right branching
– (non)parallelism in coordination
– lexical elements (subcategorization frames)

Some end up with no discrimination power after training
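For concreteness, a property-function can be as simple as a small function that inspects an analysis and returns 0/1 or a count. The toy f-structure encoding and the two properties below are my own illustration, not the features used in the experiment.

def prop_has_adjunct(fstr):
    """Fires (1) if an ADJUNCT appears anywhere in the f-structure."""
    if not isinstance(fstr, dict):
        return 0
    if "ADJUNCT" in fstr:
        return 1
    return int(any(prop_has_adjunct(v) for v in fstr.values()))

def prop_obj_count(fstr):
    """Counts OBJ grammatical functions, at any depth."""
    if not isinstance(fstr, dict):
        return 0
    return int("OBJ" in fstr) + sum(prop_obj_count(v) for v in fstr.values())

fstr = {"PRED": "see", "SUBJ": {"PRED": "pro"},
        "OBJ": {"PRED": "girl", "ADJUNCT": {"PRED": "with"}}}
print(prop_has_adjunct(fstr), prop_obj_count(fstr))   # 1 1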

Page 26: Ambiguity Management in  Deep Grammar Engineering

Stochastic Disambiguation Summary

Training:
– Define a set of features by hand
– Train on partially labelled data
– Can train on low-ambiguity data
Use:
– Choose just one structure for applications that want just one
– XLE displays the most probable first
– 5% of parse time to disambiguate
– 30% gain in F-score

Page 27: Ambiguity Management in  Deep Grammar Engineering

Talk Outline

Sources of ambiguity
Grammar engineering approaches
– Shallow markup
– (Dis)preference marks
Stochastic disambiguation
Efficiency in ambiguity management

Page 28: Ambiguity Management in  Deep Grammar Engineering

Computational consequences of ambiguity

Serious problem for computational systems
– Broad-coverage, hand-written grammars frequently produce thousands of analyses, sometimes millions
– Machine-learned grammars easily produce hundreds of thousands of analyses if allowed to parse to completion
Three approaches to ambiguity management:
– Pruning: block unlikely analysis paths early
– Procrastination: do not expand analysis paths that will lead to ambiguity explosion until something else requires them
  » Also known as underspecification
– Packing: compact representation and computation of all possible analyses

Page 29: Ambiguity Management in  Deep Grammar Engineering

The Problem with Pruning: premature disambiguation

The conventional approach: use heuristics to prune as soon as possible

[Pipeline figure: Tokenize → Morphology → Syntax → Semantics → Discourse, with statistics-driven pruning (X) applied between the stages]

Strong constraints may reject the so-far-best (= only) option
Fast computation, wrong result

Page 30: Ambiguity Management in  Deep Grammar Engineering

The problem with procrastination: passing the buck

Chunk parsing as an example:
– Collect noun groups, verb groups, PP groups
– Leave it to later processing to figure out the correct way of putting these together
– Not all combinations are grammatically acceptable
Later processing must either
– Call the parser to check grammatical constraints
– Have its own model of grammatical constraints
– In the best case, solve a set of constraints the partial parser includes with its output

Page 31: Ambiguity Management in  Deep Grammar Engineering

The Problem with Packing

There may be too many analyses to pack efficiently

A major problem for relatively unconstrained machine-induced grammars
– Grammars overgenerate massively
– Statistics used to prune out unlikely sub-analyses

Less of a problem for carefully hand-coded broad coverage grammars

Page 32: Ambiguity Management in  Deep Grammar Engineering

Packing

Explosion of ambiguity results from a small number of sub-analyses combining in different ways to produce a large number of total analyses (e.g. PP attachment)

Compute and represent each sub-analysis just once

Compute a factored representation of how these sub-analyses combine

Page 33: Ambiguity Management in  Deep Grammar Engineering

Generalizing Free Choice Packing

The sheep saw the fish.
How many sheep? How many fish?

Options multiplied out:
  The sheep-sg saw the fish-sg.
  The sheep-pl saw the fish-sg.
  The sheep-sg saw the fish-pl.
  The sheep-pl saw the fish-pl.

Options packed:
  The sheep-{sg|pl} saw the fish-{sg|pl}

In principle, a verb might require agreement of Subject and Object: have to check it out.
But English doesn't do that: any combination of choices is OK.

Page 34: Ambiguity Management in  Deep Grammar Engineering

Dependent choices

  Das Mädchen-nom sah die Katze-nom    bad
  Das Mädchen-nom sah die Katze-acc    The girl saw the cat
  Das Mädchen-acc sah die Katze-nom    The cat saw the girl
  Das Mädchen-acc sah die Katze-acc    bad

Options packed:
  Das Mädchen-{nom|acc} sah die Katze-{nom|acc}    The girl saw the cat

Again, packing avoids duplication … but it's wrong:
it doesn't encode the dependencies, and the choices are not free.

Page 35: Ambiguity Management in  Deep Grammar Engineering

Solution: Label dependent choices

  Das Mädchen-nom sah die Katze-nom    bad
  Das Mädchen-nom sah die Katze-acc    The girl saw the cat
  Das Mädchen-acc sah die Katze-nom    The cat saw the girl
  Das Mädchen-acc sah die Katze-acc    bad

• Label each choice with distinct Boolean variables p, q, etc.
• Record the acceptable combinations as a Boolean expression φ
• Each analysis corresponds to a satisfying truth-value assignment
  (a line from φ's truth table that assigns it “true”)

Packed, with labelled choices:
  Das Mädchen-{p:nom | ¬p:acc} sah die Katze-{q:nom | ¬q:acc}
  φ = (p ∧ ¬q) ∨ (¬p ∧ q)
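A small sketch of the labelling idea (my illustration, not XLE's internal representation): each packed value carries a propositional condition, the constraint φ rules out the nom+nom and acc+acc combinations, and each satisfying assignment unpacks to one reading.

from itertools import product

# Packed case values: choice variable p for "Mädchen", q for "Katze".
packed = {
    "Mädchen": {True: "nom", False: "acc"},   # p:nom, not-p:acc
    "Katze":   {True: "nom", False: "acc"},   # q:nom, not-q:acc
}

def phi(p, q):
    """(p and not q) or (not p and q): exactly one NP is nominative."""
    return (p and not q) or (not p and q)

for p, q in product([True, False], repeat=2):
    if phi(p, q):   # keep only the satisfying assignments
        print(f"p={p}, q={q}: "
              f"Das Mädchen-{packed['Mädchen'][p]} sah die Katze-{packed['Katze'][q]}")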

Page 36: Ambiguity Management in  Deep Grammar Engineering

The Free Choice Gamble

Worst case, where everything interacts:
– As many choice variables as there are readings
– Packing blows up and becomes exponential
Best case, no interactions:
– N completely independent choices represent 2^N readings
Language interactions are mostly limited & local
– Tends towards the best case
– Free choice packing pays off for linguistic analysis

Page 37: Ambiguity Management in  Deep Grammar Engineering

Conclusions

Ambiguity has to be dealt with
Deep grammars use a variety of approaches
– preprocessing
– grammar engineering
– stochastic disambiguation

Why use deep grammars if they are so ambiguous?

Page 38: Ambiguity Management in  Deep Grammar Engineering

Deep analysis matters… if you care about the answer

Example:

A delegation led by Vice President Philips, head of the chemical division, flew to Chicago a week after the incident.

Question: Who flew to Chicago?

Candidate answers:

  division        closest noun                          shallow but wrong
  head            next closest                          shallow but wrong
  V.P. Philips    next                                  shallow but wrong
  delegation      furthest away, but Subject of “flew”  deep and right

Page 39: Ambiguity Management in  Deep Grammar Engineering

Applications of Language Engineering

[Figure: applications plotted by Functionality (low to high) against Domain Coverage (narrow to broad): Manually-tagged Keyword Search, Alta Vista, Google, AskJeeves, Microsoft Paperclip, Document Base Management, Post-Search Sifting, Restricted Dialogue, Useful Summary, Good Translation, Autonomous Knowledge Filtering, Natural Dialogue, Knowledge Fusion]

Page 40: Ambiguity Management in  Deep Grammar Engineering
Page 41: Ambiguity Management in  Deep Grammar Engineering

What to do with them?

Define yes-no / 1-0 features, f, that seem important
Training determines weights on these features, λ, to reflect their actual importance
Select parse x: count occurrences of features (0,1) and multiply by corresponding weights, λ·f(x)
Convert weighted feature counts to probabilities:

  P(x) = exp(λ·f(x)) / Σ_X exp(λ·f(X))

  (numerator: un-normalized probability; denominator: normalizing factor)
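A minimal sketch of that computation (the feature names and weights are invented; this is not the XLE implementation): score each parse by λ·f(x) and normalize with the exponential formula above.

import math

weights = {"vp_attach": 0.7, "np_attach": -0.3, "coord_parallel": 1.2}

def score(features):
    """lambda . f(x): weighted sum of the features firing in parse x."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

def disambiguate(parses):
    """Return (parse name, probability) pairs, most probable first."""
    scores = {name: score(f) for name, f in parses.items()}
    z = sum(math.exp(s) for s in scores.values())             # normalizing factor
    probs = {name: math.exp(s) / z for name, s in scores.items()}
    return sorted(probs.items(), key=lambda kv: -kv[1])

parses = {
    "girl-with-telescope (NP attachment)": {"np_attach": 1},
    "see-with-telescope (VP attachment)":  {"vp_attach": 1},
}
print(disambiguate(parses))   # VP attachment comes out more probable here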

Page 42: Ambiguity Management in  Deep Grammar Engineering

Issues in Stochastic Disambiguation

What kind of probability model?
What kind of training data?
Efficiency of training, efficiency of disambiguation?
Benefit vs. random choice of parse

Page 43: Ambiguity Management in  Deep Grammar Engineering

Advantages of Free Choice Packing

Avoids procrastination
– Nogoods are constraints that the parser sends to other components
– Eliminating nogoods: other components don't do the parser's work
Independence between choices: allows processing that relies on independence assumptions
– Counting the number of readings
  » Apparently trivial but of crucial importance, since statistical modelling requires the ability to count
– Hence, statistical processing

A general mechanism extending beyond parsing

Page 44: Ambiguity Management in  Deep Grammar Engineering

Simplifying Truth Tables

Packed with labelled choices (as above):
  Das Mädchen-{p:nom | ¬p:acc} sah die Katze-{q:nom | ¬q:acc}
  φ = (p ∧ ¬q) ∨ (¬p ∧ q)

  p q | φ
  1 1 | 0
  1 0 | 1
  0 1 | 1
  0 0 | 0

φ holds exactly when q = ¬p, so q can be replaced by ¬p:

  Das Mädchen-{p:nom | ¬p:acc} sah die Katze-{¬p:nom | p:acc}
  φ = (p ∨ ¬p) = true

  p | φ
  1 | 1
  0 | 1

Freely choose any line from the truth table.