Semantics



Page 1:

Semantics

Page 2:

Where are we in the “Big Picture”

[Pipeline diagram: Speech → ASR → Text → Morph Analysis → Parsing → Syntactic Parse → Semantic Interpreter → Semantic Representation → Inference Engine ↔ WORLD of FACTS]

Page 3:

Semantic Representation

Syntactic representation:

• phrases and tree structures

• dependency information between words

Semantic representation:

• What’s the purpose of this representation?
– Interface between syntactic information and the inference engine

Requirements on the semantic representation:

• Supports inference
– Every CEO is wealthy and Gates is a CEO ⇒ Gates is wealthy

• Normalizes syntactic variations
– Delta serves NYC == NYC is served by Delta

• Can represent the distinctions language makes
– John believes Delta serves NYC ≠ Delta serves NYC

• Unambiguous representation
– John wants to eat someplace close to the university

Page 4:

Mechanisms for Expressing Meaning

Linguistic means for expressing meaning

• Words: lexical semantics and word senses
– Delta will serve NYC
– This flight will serve peanuts
– John will serve as CEO

• Syntactic information: predicate-argument structure
– John wants to eat a cake
– John wants Mary to eat a cake
– John wants a cake

• Prosodic information in speech
– Legumes are a good source of vitamins

• Gesture information in multimodal communication

Page 5:

First-order Predicate Calculus: A refresher

A formal system used to derive new propositions and verify their truth given a world.

Syntax of FOPC:

• Formulae: quantifiers and connectives on predicates

• Predicates: n-ary predications of facts and relations

• Terms: constants, variables and functions

World: Truth assignments to formulae

Inference:

• Modus ponens
– Every CEO is wealthy: ∀x CEO(x) ⇒ wealthy(x)
– Gates is a CEO: CEO(Gates)
– Derives: wealthy(Gates)

Given a world, determining the truth value of a formula is a search process – backward chaining and forward chaining.

• Much like the top-down and bottom-up parsing algorithms.
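
To make the inference step concrete, here is a minimal forward-chaining sketch in Python over the CEO example; the tuple encoding of unary facts and rules is an illustrative assumption, not the slides' notation.

facts = {("CEO", "Gates")}
# "Every CEO is wealthy" = forall x: CEO(x) => wealthy(x), encoded as (premise, conclusion)
rules = [("CEO", "wealthy")]

def forward_chain(facts, rules):
    """Apply modus ponens repeatedly until no new facts can be derived."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            for pred, arg in list(known):
                if pred == premise and (conclusion, arg) not in known:
                    known.add((conclusion, arg))   # derives wealthy(Gates)
                    changed = True
    return known

print(forward_chain(facts, rules))   # contains ('wealthy', 'Gates')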

Page 6:

Logic for Language

Representations for different aspects of language:

• Entities
– Delta, Gates, AT&T

• Categories
– restaurants, airlines, students

• Events
– I ate lunch. I ate at my desk. I ate lunch at my desk.

• Time (utterance time, reference time, event time)
– I ate lunch when the flight arrived
– I had eaten lunch when the flight arrived

• Aspect
– Stative, activity, achievement and accomplishment

• Quantification
– Every person loves some movie

• Predication
– John is a teacher

• Modal operators
– John believes Mary went to the movies

Page 7:

Linking Syntax and Semantics

How to compute semantic representations from syntactic trees?

We could have one function for each syntactic tree that maps it to its semantic representation.

• Too many such functions

• Not all aspects of the tree might be needed for its semantics

Meaning derives from

• The people and activities represented (predicates and arguments, or nouns and verbs)

• The way they are ordered and related: the syntax of the representation, which may also reflect the syntax of the sentence

Compositionality Assumption: The meaning of the whole sentence is composed of the meanings of its parts.

• George cooks. Dan eats. Dan is sick.

• Cook(George), Eat(Dan), Sick(Dan)

• If George cooks and Dan eats, Dan will get sick.

(Cook(George) ^ Eat(Dan)) ⇒ Sick(Dan)

The trick is to decide what the size of the part should be.

• Rule-to-rule hypothesis

Page 8:

Linking Syntax and Semantics – contd.

Compositionality:

• Augment the lexicon and the grammar (as we did with feature structures)

• Devise a mapping between rules of the grammar and rules of semantic representation

• For CFGs, this amounts to a Rule-to-Rule Hypothesis

Each grammar rule is embellished with instructions on how to map the components of the rule to a semantic representation.

S → NP VP {VP.sem(NP.sem)}

Each semantic function is defined in terms of the semantic representation of choice.

Page 9:

Syntax-Driven Semantics

[Parse tree with semantic attachments: S : fly(birds), dominating NP : birds (N : birds → birds) and VP : fly (V : fly → fly)]

There are still a few free parameters:

a. What should the semantic representation of each component be?

b. How should we combine the component representations?

Depends on the final representation we want.

Page 10:

A Simple Example

McDonald’s serves burgers.

Associating constants with constituents

• ProperNoun → McDonald’s {McDonald’s}

• PlNoun → burgers {burgers}

Defining functions to produce these from input

• NP → ProperNoun {ProperNoun.sem}

• NP → PlNoun {PlNoun.sem}

• Assumption: meaning representations of children are passed up to parents for non-branching constituents

Verbs are where the action is

Page 11:

• V → serves {∃(e,x,y) Isa(e,Serving) ^ Server(e,x) ^ Served(e,y)}, where e = event, x = agent, y = patient

• Will every verb have its own distinct representation?
– McDonald’s hires students.
– McDonald’s gave customers a bonus.
– Predicate(Agent, Patient, Beneficiary)

Once we have the semantics for each constituent, how do we combine them?

• VP → V NP {V.sem(NP.sem)}

• Goal for VP semantics: ∃(e,x) Isa(e,Serving) ^ Server(e,x) ^ Served(e,burgers)

• VP.sem must tell us
– Which variables are to be replaced by which arguments
– How this replacement is done

Page 12:

Lambda Notation

Extension to First-Order Predicate Calculus: λx P(x)

• λ + variable(s) + FOPC expression in those variables

Lambda binding

• Apply a lambda expression to logical terms to bind the lambda expression’s parameters to those terms (lambda reduction)

• Simple process: substitute terms for variables in the lambda expression: λx P(x)(car) ⇒ P(car)

Page 13:

Lambda Abstraction and Application

Abstraction: Make variable in the body available for binding.

• to external arguments provided by semantics of other constituents (e.g. NPs)

Application: Substitute the bound variable with the value

Semantic attachment for V → serves, to be applied as V.sem(NP.sem) in VP → V NP:

{∃(e,x,y) Isa(e,Serving) ^ Server(e,y) ^ Served(e,x)} converts to the lambda expression:

{λx ∃(e,y) Isa(e,Serving) ^ Server(e,y) ^ Served(e,x)}

• Now ‘x’ is available to be bound when V.sem is applied to NP.sem of direct object (V.sem(NP.sem))

• application binds x to value of NP.sem (burgers)

• Value of VP.sem becomes:

{∃(e,y) Isa(e,Serving) ^ Server(e,y) ^ Served(e,burgers)}

Page 14:

Lambda Abstraction and Application – contd.

Similarly, we need a semantic attachment for S → NP VP {VP.sem(NP.sem)} to add the subject NP to our semantic representation of McDonald’s serves burgers

• Back to V.sem for serves

• We need another λ-abstraction in the value of VP.sem

• Change the semantic representation of V to include another argument to be bound later

V → serves {λx λy ∃(e) Isa(e,Serving) ^ Server(e,y) ^ Served(e,x)}

Value of VP.sem becomes:

{λy ∃(e) Isa(e,Serving) ^ Server(e,y) ^ Served(e,burgers)}

Value of S.sem becomes:

{∃(e) Isa(e,Serving) ^ Server(e,McDonald’s) ^ Served(e,burgers)}
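
The composition above can be sketched with Python lambdas standing in for λ-abstraction; representing the event as a dictionary of roles is an illustrative choice, not the slides' FOPC notation.

# V -> serves : λx λy ∃e Isa(e,Serving) ^ Server(e,y) ^ Served(e,x)
V_sem = lambda x: lambda y: {"Isa": "Serving", "Server": y, "Served": x}

NP_obj_sem = "burgers"        # NP -> PlNoun (burgers)
NP_subj_sem = "McDonald's"    # NP -> ProperNoun (McDonald's)

VP_sem = V_sem(NP_obj_sem)    # VP -> V NP  {V.sem(NP.sem)}: binds the Served role
S_sem = VP_sem(NP_subj_sem)   # S  -> NP VP {VP.sem(NP.sem)}: binds the Server role

print(S_sem)   # {'Isa': 'Serving', 'Server': "McDonald's", 'Served': 'burgers'}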

Page 15:

Several Complications

For example, terms can be complex: A restaurant serves burgers.

• ‘a restaurant’: ∃x Isa(x,restaurant)

• ∃e Isa(e,Serving) ^ Server(e, <∃x Isa(x,restaurant)>) ^ Served(e,burgers)

• Allows quantified expressions to appear where terms can, by providing rules to turn them into well-formed FOPC expressions

Issues of quantifier scope

Every restaurant serves burgers.

Every restaurant serves every burger.

Page 16:

Semantic representations for other constituents?

• Adjective phrases:
– Happy people, cheap food, purple socks
– intersective semantics

Nom → Adj Nom {λx Nom.sem(x) ^ Isa(x,Adj.sem)}

Adj → cheap {Cheap}

λx Isa(x,Food) ^ Isa(x,Cheap) … works OK …

But… fake gun? Local restaurant? Former friend? Would-be singer?

∃x Isa(x,Gun) ^ Isa(x,Fake)

Page 17:

Doing Compositional Semantics

Incorporating compositional semantics into CFG requires:

• Right representation for each constituent based on the parts of that constituent (e.g. Adj)

• Right representation for a category of constituents based on other grammar rules, making use of that constituent (e.g. V.sem)

This gives us a set of function-like semantic attachments incorporated into our CFG

• E.g. Nom → Adj Nom {λx Nom.sem(x) ^ Isa(x,Adj.sem)}

A number of formalisms extend CFGs to allow larger compositionality domains.

Page 18:

Computing the Semantic Representation

Two approaches:

• Compute the semantic representation of each constituent as the parser progresses through the rules.
– Semantic representations could be used to rule out parses
– But time is wasted constructing semantics for unused constituents

• Let the parser complete the syntactic parse and then recover the semantic representation
– in a bottom-up traversal

Issues of ambiguous syntactic representation

• Packing ambiguity

• Underspecified semantics.

Page 19:

Non-Compositional Language

Non-compositional modifiers: fake, former, local

Metaphor:

• You’re the cream in my coffee. She’s the cream in George’s coffee.

• The break-in was just the tip of the iceberg.

• This was only the tip of Shirley’s iceberg.

Idioms:

• The old man finally kicked the bucket.

• The old man finally kicked the proverbial bucket.

Deferred reference: The ham sandwich wants his check.

Solutions? Mix lexical items with special grammar rules? Or???

Page 20:

Lexical Semantics

Page 21:

Thinking about Words Again

Lexeme: an entry in the lexicon that includes

• an orthographic representation

• a phonological form

• a symbolic meaning representation or sense

Some typical dictionary entries:

• Red (‘red) n: the color of blood or a ruby

• Blood (‘bluhd) n: the red liquid that circulates in the heart, arteries and veins of animals

Page 22:

• Right (‘rIt) adj: located nearer the right hand esp. being on the right when facing the same direction as the observer

• Left (‘left) adj: located nearer to this side of the body than the right

Can we get semantics directly from online dictionary entries?

• Some are circular

• All are defined in terms of other lexemes

• You have to know something to learn something

What can we learn from dictionaries?

• Relations between words:

– Oppositions, similarities, hierarchies

Page 23:

Homonymy

Homonyms: Words with the same form – orthography and pronunciation – but different, unrelated meanings, or senses (multiple lexemes)

• A bank holds investments in a custodial account in the client’s name.

• As agriculture is burgeoning on the east bank, the river will shrink even more.

Word sense disambiguation: what clues?

Similar phenomena

• homophones – read and red
– same pronunciation / different orthography

• homographs – bass and bass
– same orthography / different pronunciation

Page 24:

Ambiguity: Which applications will these cause problems for?

A bass, the bank, /red/

General semantic interpretation

Machine translation

Spelling correction

Speech recognition

Text to speech

Information retrieval

Page 25:

Polysemy

Word with multiple but related meanings (same lexeme)

• They rarely serve red meat.

• He served as U.S. ambassador.

• He might have served his time in prison.

What’s the difference between polysemy and homonymy?

Homonymy:

• Distinct, unrelated meanings

• Different etymology? Coincidental similarity?

Page 26:

Polysemy:

• Distinct but related meanings

• idea bank, sperm bank, blood bank, bank bank

• How different?
– Different subcategorization frames?
– Domain specificity?
– Can the two candidate senses be conjoined? ?He served his time and as ambassador to Norway.

For either, the practical task:

• What are its senses? (related or not)

• How are they related? (polysemy ‘easier’ here)

• How can we distinguish them?

Page 27:

Tropes, or Figures of Speech

Metaphor: one entity is given the attributes of another (tenor/vehicle/ground)

• Life is a bowl of cherries. Don’t take it serious….

• We are the eyelids of defeated caves. ??

Metonymy: one entity used to stand for another (replacive)

• GM killed the Fiero.

• The ham sandwich wants his check.

Both extend existing sense to new meaning

• Metaphor: completely different concept

• Metonymy: related concepts

Page 28:

Synonymy

Substitutability: different lexemes, same meaning

• How big is that plane?

• How large is that plane?

• How big are you? Big brother is watching.

What influences substitutability?

• Polysemy (large vs. old sense)

• register: He’s really cheap/?parsimonious.

• collocational constraints: roast beef, ?baked beef; economy fare, ?economy price

Page 29:

Finding Synonyms and Collocations Automatically from a Corpus

Synonyms: Identify words appearing frequently in similar contexts

Blast victims were helped by civic-minded passersby.

Few passersby came to the aid of this crime victim.

Collocations: Identify synonyms that don’t appear in some specific similar contexts

Flu victims, flu sufferers, …

Crime victims, ?crime sufferers, …
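
A rough sketch of the corpus idea, using a tiny invented corpus: build co-occurrence (context) vectors and compare words with cosine similarity, so that words used in similar contexts score higher.

from collections import Counter
from math import sqrt

corpus = [
    "blast victims were helped by passersby",
    "few passersby came to the aid of this crime victim",
    "flu victims stayed home",
    "flu sufferers stayed home",
]

def context_vector(target, sentences, window=2):
    """Count words appearing within `window` positions of the target (crude prefix match)."""
    counts = Counter()
    for sent in sentences:
        toks = sent.split()
        for i, tok in enumerate(toks):
            if tok.startswith(target):          # matches victim/victims, sufferer/sufferers
                lo, hi = max(0, i - window), i + window + 1
                counts.update(t for j, t in enumerate(toks[lo:hi], lo) if j != i)
    return counts

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a)
    norm = lambda v: sqrt(sum(x * x for x in v.values()))
    return dot / (norm(a) * norm(b)) if a and b else 0.0

print(cosine(context_vector("victim", corpus), context_vector("sufferer", corpus)))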

Page 30:

Hyponymy

General: hypernym (super…ordinate)
• dog is a hypernym of poodle

Specific: hyponym (under…neath)
• poodle is a hyponym of dog

Test: That is a poodle implies that is a dog

Ontology: set of domain objects

Taxonomy? Specification of relations between those objects

Object hierarchy? Structured hierarchy that supports feature inheritance (e.g. poodle inherits some properties of dog)

Page 31:

Semantic Networks

Used to represent lexical relationships

• e.g. WordNet (George Miller et al.)

• Most widely used hierarchically organized lexical database for English

• Synset: set of synonyms, a dictionary-style definition (or gloss), and some examples of uses --> a concept

• Databases for nouns, verbs, and modifiers

Applications can traverse the network to find synonyms, antonyms, hierarchies, ...

• Available for download or online use

• http://www.cogsci.princeton.edu/~wn
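
For example, assuming NLTK's WordNet interface (with the wordnet data downloaded), synsets, glosses, and the hypernym hierarchy can be traversed like this; which senses come back first is not guaranteed.

from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

for syn in wn.synsets("bass"):          # synsets = concepts, with glosses
    print(syn.name(), "-", syn.definition())

# Hyponymy test from the previous slide: is poodle a kind of dog?
poodle = wn.synsets("poodle")[0]
dog = wn.synsets("dog")[0]
print(dog in poodle.closure(lambda s: s.hypernyms()))   # expected True for the usual first senses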

Page 32:

Using WN, e.g. in Question-Answering

Pasca & Harabagiu ’01 results on TREC corpus

• Parses questions to determine question type, key words (Who invented the light bulb?)

• Person question; invent, light, bulb

• The modern world is an electrified world. It might be argued that any of a number of electrical appliances deserves a place on a list of the millennium's most significant inventions. The light bulb, in particular, profoundly changed human existence by illuminating the night and making it hospitable to a wide range of human activity. The electric light, one of the everyday conveniences that most affects our lives, was invented in 1879 simultaneously by Thomas Alva Edison in the United States and Sir Joseph Wilson Swan in England.

Finding named entities is not enough

Page 33:

Compare expected answer ‘type’ to potential answers

• For questions of type person, expect the answer to be a person

• Identify potential person names in passages retrieved by IR

• Check in WN to find which of these are hyponyms of person

Or, consider reformulations of the question: Who invented the light bulb?

• For key words in the query, look for WN synonyms

• E.g. Who fabricated the light bulb?

• Use this query for initial IR

Results: improve system accuracy by 147% (on some question types)

Page 34:

Thematic Roles

∃ w,x,y,z {Giving(x) ^ Giver(w,x) ^ Givee(z, x) ^ Given(y,x)}

A set of roles for each event:

• Agent: volitional causer – John hit Bill.

• Experiencer: experiencer of event – Bill got a headache.

• Force: non-volitional causer – The concrete block struck Bill on the head.

• Theme/patient: most affected participant – John hit Bill.

• Result: end product – Bill got a headache.

• Content: proposition of propositional event – Bill thought he should take up martial arts.

Page 35:

• Instrument: instrument used -- John hit Bill with a bat

• Beneficiary: cui bono – John hit Bill to avenge his friend

• Source: origin of object of transfer event – Bill fled from New York to Timbuktu

• Goal: destination of object – Bill fled from New York to Timbuktu

But there are a lot of verbs, with a lot of frames…

FrameNet encodes frames for many verb categories

Page 36:

Thematic Roles and Selectional Restrictions

Selectional restrictions: semantic constraint that a word (lexeme) imposes on the concepts that go with it

George hit Bill with … John / a gun / gusto.

Jim killed his philodendron/a fly/Bill.

?His philodendron killed Jim.

The flu/Misery killed Jim.

Page 37:

Thematic Roles/Selectional Restrictions

In practical use:

• Given e.g. a verb and a corpus (plus FrameNet)

• What conceptual roles are likely to accompany it?

• What lexemes are likely to fill those roles?

Assassinate

Give

Imagine

Fall

Serve

Page 38:

Schank's Conceptual Dependency

Eleven predicate primitives represent all predicates

Objects decomposed into primitive categories and modifiers

But having only a few predicates results in very complex representations of simple things

∃x,y Atrans(x) ^ Actor(x,John) ^ Object(x,Book) ^ To(x,Mary) ^ Ptrans(y) ^ Actor(y,John) ^ Object(y,Book) ^ To(y,Mary)

John caused Mary to die vs. John killed Mary

Page 39:

Robust Semantics, Information Extraction, and Information Retrieval

Page 40:

Problems with Syntax-Driven Semantics

Compositionality:

• Expects correspondence between syntactic and semantic structures. – Mismatch between syntactic structures and semantic structures: certainly not rule-to-rule.

(inadequacy of CFGs)

I like soup. Soup is what I like.

• Constituent trees contain many structural elements not clearly important to making semantic distinctions– Resort to dependency trees.

• Too abstract: syntax-driven semantic representations are sometimes very abstract.
– Nominal → Adjective Nominal {λx Nominal.sem(x) ^ AM(x,Adj.sem)}
– Cheap restaurant, Italian restaurant, local restaurant

Robust Semantic processing: Trade-off

• Portability

• Expressivity

Page 41:

Semantic Grammars

Before:
• CFG with syntactic categories, with semantic representation composition overlaid.

Now:
• CFG with domain-specific semantic categories

• Domain specific: rules correspond directly to entities and activities in the domain

I want to go from Boston to Baltimore on Thursday, September 24th

• Greeting → {Hello | Hi | Um …}

• TripRequest → Need-spec travel-verb from City to City on Date

Note: Semantic grammars are still CFGs.
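
A minimal sketch of such a domain grammar, written here as a single regular-expression rule that fills a slot-filler frame; the pattern, slot names, and coverage are invented for illustration.

import re

# TripRequest -> Need-spec travel-verb from City to City on Date
TRIP_REQUEST = re.compile(
    r"(?:I want|I need|I'd like) to (?:go|fly|travel) "
    r"from (?P<origin>[A-Z][a-z]+) to (?P<destination>[A-Z][a-z]+) "
    r"on (?P<date>.+)", re.IGNORECASE)

def parse_trip_request(utterance):
    """Return a slot-filler frame, or None if the utterance is outside the grammar."""
    m = TRIP_REQUEST.search(utterance)
    return None if m is None else {"frame": "TripRequest", **m.groupdict()}

print(parse_trip_request(
    "I want to go from Boston to Baltimore on Thursday, September 24th"))
# {'frame': 'TripRequest', 'origin': 'Boston', 'destination': 'Baltimore',
#  'date': 'Thursday, September 24th'}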

Page 42:

Pros and Cons of Semantic Grammars

• Semantic grammars encode task knowledge and constrain the range of possible user input.
I want to go to Boston on Thursday.
I want to leave from there on Friday for Baltimore.
TripRequest → Need-spec travel-verb from City on Date for City

• The semantic representation is a slot-filler, frame-like representation crafted for that domain.

• Portability: lack of generality
– A new grammar for each application
– Large cost in development time

• Robustness: if users go outside the grammar, things may break disastrously
I want to go from ah to Boston from Newark

• Expressivity:
– I want to go to Boston from Newark or New York

Page 43:

Information Extraction

Another ‘robust’ alternative

Idea: ‘extract’ particular types of information from arbitrary text or transcribed speech

Examples:

• Named entities: people, places, organizations, times, dates
– <Organization>MIPS</Organization> Vice President <Person>John Hime</Person>

• MUC evaluations

Domains: Medical texts, broadcast news (terrorist reports), company mergers, customer care voicemail,...

Page 44:

Appropriate where Semantic Grammars and Syntactic Parsers are Not

Appropriate where information needs are very specific and specifiable in advance

• Question answering systems, gisting of news or mail…

• Job ads, financial information, terrorist attacks

Input too complex and far-ranging to build semantic grammars

But full-blown syntactic parsers are impractical

• Too much ambiguity for arbitrary text

• 50 parses or none at all

• Too slow for real-time applications

Page 45:

Information Extraction Techniques

Often use a set of simple templates or frames with slots to be filled in from input text

• Ignore everything else

• My number is 212-555-1212.

• The inventor of the wiggleswort was Capt. John T. Hart.

• The king died in March of 1932.

Generative Model:

• POS-style HMM model (with novel encoding)

• The/O king/O died/O in/O March/I of/I 1932/I in/O France/O

• T* = argmax_T P(W|T) · P(T)

Context

• neighboring words, capitalization, and punctuation can be used as well
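
A toy Viterbi decoder for the T* = argmax_T P(W|T) · P(T) step; the transition and emission probabilities below are invented for illustration, not estimated from any corpus.

states = ["O", "I"]
start = {"O": 0.9, "I": 0.1}
trans = {"O": {"O": 0.8, "I": 0.2}, "I": {"O": 0.3, "I": 0.7}}
emit = {
    "O": {"the": 0.3, "king": 0.2, "died": 0.2, "in": 0.2, "March": 0.05, "1932": 0.05},
    "I": {"the": 0.05, "king": 0.05, "died": 0.05, "in": 0.25, "March": 0.3, "1932": 0.3},
}

def viterbi(words):
    """Return the most probable tag sequence under the toy HMM."""
    # V[t][s] = (best probability of a tag path ending in state s at position t, that path)
    V = [{s: (start[s] * emit[s].get(words[0], 1e-6), [s]) for s in states}]
    for w in words[1:]:
        prev = V[-1]
        V.append({s: max((prev[p][0] * trans[p][s] * emit[s].get(w, 1e-6), prev[p][1] + [s])
                         for p in states)
                  for s in states})
    return max(V[-1].values())[1]

print(viterbi(["the", "king", "died", "in", "March", "of", "1932"]))
# ['O', 'O', 'O', 'I', 'I', 'I', 'I'] with these toy numbers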

Page 46:

Discriminative Disambiguation Techniques

• A large set of features makes MLE estimation of the parameters unreliable.

P(T|W) = Π_i P(t_i | W, POS, Ortho) = Π_i P(t_i | w_{i−k}…w_{i+k}, pos_{i−k}…pos_{i+k}, ortho_i)

• Direct approach:
– F(t_i, w_{i−k}…w_{i+k}, pos_{i−k}…pos_{i+k}, ortho_i) = F(y, X)
– F(y, X) is used in a log-linear model:

P(y|X) = exp( Σ_i λ_i f_i(y, X) ) / Σ_{y′∈Y} exp( Σ_i λ_i f_i(y′, X) )

• Maximum Entropy Markov Models, Conditional Random Fields

Page 47:

ScanMail Transcription

gender F

age A

caller_name NA

native_speaker N

speech_pathology N

sample_rate 8000

label 0 804672 " [ Greeting: hi R__ ] [ CallerID: it's me ] give me a call [ um ] right away cos there's [ .hn ] I guess there's some [ .hn ] change [ Date: tomorrow ] with the nursery school and they [ um ] [ .hn ] anyway they had this idea [ cos ] since I think J__'s the only one staying [ Date: tomorrow ] for play club so they wanted to they suggested that [ .hn ] well J2__actually offered to take J__home with her and then would she would meet you back at the synagogue at [ Time: five thirty ] to pick her up [ .hn ] [ uh ] so I don't know how you feel about that otherwise Miriam and one other teacher would stay and take care of her till [ Date: five thirty tomorrow ] but if you [ .hn ] I wanted to know how you feel before I tell her one way or the other so call me [ .hn ] right away cos I have to get back to her in about an hour so [ .hn ] okay [ Closing: bye [ .nhn ] [ .onhk ] ]"

duration "50.3 seconds"

Page 48:

[Diagram: SCANMail access devices – PC, Pocket PC, Dataphone, Voice Phone, Flash E-mail]

Page 49:

Word Sense Disambiguation

Page 50:

Disambiguation via Selectional Restrictions

A step toward semantic parsing

• Different verbs select for different thematic roles

wash the dishes (takes washable-thing as patient)
serve delicious dishes (takes food-type as patient)

Method: rule-to-rule syntactico-semantic analysis

• Semantic attachment rules are applied as sentences are syntactically parsed

VP → V NP
V → serve <theme> {theme: food-type}

• Selectional restriction violation: no parse

Page 51:

Requires:

• Write selectional restrictions for each sense of each predicate – or use FrameNet
– serve alone has 15 verb senses

• Hierarchical type information about each argument (à la WordNet)
– How many hypernyms does dish have?
– How many lexemes are hyponyms of dish?

But also:

• Sometimes selectional restrictions don’t restrict enough (Which dishes do you like?)

• Sometimes they restrict too much (Eat dirt, worm! I’ll eat my hat!)

Page 52:

Can we take a more statistical approach?

How likely is dish/crockery to be the object of serve? dish/food?

A simple approach (baseline): predict the most likely sense

• Why might this work?

• When will it fail?

A better approach: learn from a tagged corpus

• What needs to be tagged?

An even better approach: Resnik’s selectional association (1997, 1998)

• Estimate conditional probabilities of word senses from a corpus tagged only with verbs and their arguments (e.g. ragout is an object of served – Jane served/V ragout/Obj)

Page 53:

Machine Learning Approaches

Learn a classifier to assign one of possible word senses for each word

• Acquire knowledge from labeled or unlabeled corpus

• Human intervention only in labeling corpus and selecting set of features to use in training

Input: feature vectors

• Target (dependent variable)

• Context (set of independent variables)

Output: classification rules for unseen text

Page 54:

Supervised Learning

Training and test sets with words labeled as to correct sense (It was the biggest [fish: bass] I’ve seen.)

• Obtain values of independent variables automatically (POS, co-occurrence information, …)

• Run classifier on training data

• Test on test data

• Result: Classifier for use on unlabeled data

Page 55:

Input Features for WSD

POS tags of target and neighbors

Surrounding context words (stemmed or not)

Punctuation, capitalization and formatting

Partial parsing to identify thematic/grammatical roles and relations

Collocational information:

• How likely are target and left/right neighbor to co-occur

Co-occurrence of neighboring words

• Intuition: How often does sea (or other related words) co-occur with bass?

Page 56:

• How do we proceed?
– Look at a window around the word to be disambiguated, in training data
– Which features accurately predict the correct tag?
– Can you think of other features that might be useful in general for WSD?

Input to learner, e.g.

Is the bass fresh today?

[w-2, w-2/pos, w-1, w-1/pos, w+1, w+1/pos, w+2, w+2/pos, …]

[is, V, the, DET, fresh, RB, today, N, …]
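
A sketch of building that window feature vector; the sentence below is hand-tagged for the example, and the feature names follow the scheme above.

def window_features(tagged, i, k=2):
    """Features for the target at index i: words and POS tags within +/- k positions."""
    feats = {}
    for off in range(-k, k + 1):
        if off == 0:
            continue
        j = i + off
        w, pos = tagged[j] if 0 <= j < len(tagged) else ("<pad>", "<pad>")
        feats[f"w{off:+d}"] = w
        feats[f"w{off:+d}/pos"] = pos
    return feats

sent = [("is", "V"), ("the", "DET"), ("bass", "N"),
        ("fresh", "ADJ"), ("today", "N"), ("?", ".")]
print(window_features(sent, i=2))
# {'w-2': 'is', 'w-2/pos': 'V', 'w-1': 'the', 'w-1/pos': 'DET',
#  'w+1': 'fresh', 'w+1/pos': 'ADJ', 'w+2': 'today', 'w+2/pos': 'N'}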

Page 57:

Types of Classifiers

Naïve Bayes

• ŝ = argmax_{s ∈ S} P(s|V)

• where s is one of the possible senses and V is the input vector of features

• By Bayes’ rule: ŝ = argmax_{s ∈ S} P(V|s) P(s) / P(V), and P(V) is the same for any s

• Assume features are independent, so the probability of V is the product of the probabilities of each feature given s:
P(V|s) ≈ Π_{j=1..n} P(v_j|s)

• Then ŝ = argmax_{s ∈ S} P(s) Π_{j=1..n} P(v_j|s)

Page 58:

Rule Induction Learners (e.g. Ripper)

Given training observations, each a feature vector of values for the independent variables paired with a sense label (e.g. [fishing, NP, 3, …] + bass2)

Produce a set of rules that perform best on the training data, e.g.

• bass2 if w-1==‘fishing’ & pos==NP

• …

Page 59:

Decision Lists

Like case statements, applying tests to the input in turn:

fish within window --> bass1

striped bass --> bass1

guitar within window --> bass2

bass player --> bass2

• Ordering is based on each test’s individual accuracy on the entire training set, measured by the log-likelihood ratio:

Abs(Log( P(Sense1 | f_j) / P(Sense2 | f_j) ))

Page 60:

Bootstrapping I

• Start with a few labeled instances of target item as seeds to train initial classifier, C

• Use high confidence classifications of C on unlabeled data as training data

• Iterate

Bootstrapping II

• Start with sentences containing words strongly associated with each sense (e.g. sea and music for bass), either intuitively or from corpus or from dictionary entries

• One Sense per Discourse hypothesis
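
A sketch of the bootstrapping loop; train_fn and predict_fn are placeholders for any classifier (e.g. the Naïve Bayes sketch above), and the confidence threshold is an illustrative choice.

def bootstrap(seed_labeled, unlabeled, train_fn, predict_fn, threshold=0.9, max_iters=5):
    """Self-training: grow the labeled set from high-confidence predictions."""
    labeled = list(seed_labeled)        # [(features, sense), ...] from the seeds
    pool = list(unlabeled)              # [features, ...] still unlabeled
    for _ in range(max_iters):
        model = train_fn(labeled)
        newly, remaining = [], []
        for feats in pool:
            sense, confidence = predict_fn(model, feats)
            if confidence >= threshold:
                newly.append((feats, sense))    # trust confident classifications
            else:
                remaining.append(feats)
        if not newly:
            break                               # nothing confident left to add
        labeled.extend(newly)
        pool = remaining
    return train_fn(labeled)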

Page 61:

Unsupervised Learning

Cluster feature vectors to ‘discover’ word senses using some similarity metric (e.g. cosine distance)

• Represent each cluster as average of feature vectors it contains

• Label clusters by hand with known senses

• Classify unseen instances by proximity to these known and labeled clusters

Evaluation problem

• What are the ‘right’ senses?

• Cluster impurity

• How do you know how many clusters to create?

• Some clusters may not map to ‘known’ senses
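
A sketch of the clustering step, assuming scikit-learn is available; note it uses KMeans with Euclidean distance over bag-of-words counts rather than cosine distance, and the choice of two clusters is exactly the "how many clusters" problem noted above.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cluster import KMeans

contexts = [
    "caught a huge bass while fishing on the river",
    "the striped bass swam near the boat",
    "the bass player joined the jazz band",
    "he plays bass guitar in a rock band",
]

X = CountVectorizer().fit_transform(contexts)            # context feature vectors
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)   # two 'discovered' senses; label the clusters by hand afterwards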

Page 62:

Dictionary Approaches

Problem of scale for all ML approaches

• Build a classifier for each sense ambiguity

Machine readable dictionaries (Lesk ‘86)

• Retrieve all definitions of content words occurring in context of target (e.g. the happy seafarer ate the bass)

• Compare for overlap with sense definitions of target entry (bass2: a type of fish that lives in the sea)

• Choose sense with most overlap

Limits: Entries are short --> expand entries to ‘related’ words
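
A simplified Lesk sketch using WordNet glosses as the machine-readable dictionary; the NLTK usage is an assumption (NLTK also ships a ready-made lesk in nltk.wsd), and overlap here is plain word overlap with the context.

from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

def simplified_lesk(word, context):
    """Pick the sense whose dictionary gloss overlaps most with the context words."""
    context_words = set(context.lower().split())
    best, best_overlap = None, -1
    for sense in wn.synsets(word):
        overlap = len(set(sense.definition().lower().split()) & context_words)
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

sense = simplified_lesk("bass", "the happy seafarer ate the bass from the sea")
print(sense, "-", sense.definition() if sense else "no senses found")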