
Page 1:

Semantic Annotation Evaluation and Utility

Bonnie Dorr, Saif Mohammad, David Yarowsky, Keith Hall

Page 2:

Road Map

• Project Organization
• Semantic Annotation and Utility Evaluation Workshop
• Focus Area: Informal Input
  – Belief/Opinion/Confidence (modality)
  – Dialog Acts
  – Complex Coreference (e.g., events)
  – Temporal relations
• Interoperability
• Current and Future Work

Page 3:

Project Organization

• Columbia (Rambow, Passonneau): Dialogic Content, Committed Belief
• CMU (Mitamura, Levin, Nyberg): Coreference, Entity Relations, Committed Belief
• BBN (Ramshaw, Habash): Temporal Annotation, Coreference (complex)
• Affiliated Efforts: Ed Hovy, Martha Palmer, George Wilson (Mitre)
• UMBC (Nirenburg, McShane): Modality (polarity, epistemic, belief, deontic, volitive, potential, permissive, evaluative)
• Evaluation: Bonnie Dorr, David Yarowsky, Keith Hall, Saif Mohammad

Page 4:

Semantic Annotation & Utility Evaluation Meeting: Feb 14th

• Site presentations included an overview of the phenomena covered and utility-motivating examples, extracted from the target corpus.

• Collective assessment of what additional capabilities could be achieved if a machine reached near-human performance on annotating these meaning layers, relative to applications that operate on text without such analysis.

• Compatibility, interoperability, and integration into a larger KB environment.

• How can we automate these processes?

Page 5:

Attendees

• Kathy Baker (DoD)
• Mona Diab (Columbia)
• Bonnie Dorr (UMD)
• Tim Finin (JHU/APL)
• Nizar Habash (Columbia)
• Keith Hall (JHU)
• Eduard Hovy (USC/ISI)
• Lori Levin (CMU)
• James Mayfield (JHU/APL)
• Teruko Mitamura (CMU)
• Saif Mohammad (UMD)
• Smaranda Muresan (UMD)
• Sergei Nirenburg (UMBC)
• Eric Nyberg (CMU)
• Doug Oard (UMD)
• Boyan Onyshkevych (DoD)
• Martha Palmer (Colorado)
• Rebecca Passonneau (Columbia)
• Owen Rambow (Columbia)
• Lance Ramshaw (BBN)
• Clare Voss (ARL)
• Ralph Weischedel (BBN)
• George Wilson (Mitre)
• David Yarowsky (JHU)

Page 6:

Analysis of Informal Input: Unifies Majority of Annotation Themes

• Four relevant representational layers:
  – Belief/Opinion/Confidence (modality)
  – Dialog Acts
  – Coreference (entities and events)
  – Temporal relations
• Many relevant applications:
  – KB population
  – Social Network Analysis
  – Sentiment analysis
  – Deception detection
  – Text mining
  – Question answering
  – Information retrieval
  – Summarization
• Analysis of informal input is dynamic: a first analysis may be refined as subsequent informal input contributions are processed.

Page 7:

Representational Layer 1: Committed Belief

• Committed belief: Speaker indicates in this utterance that Speaker believes the proposition.
  – “I know Afghanistan and Pakistan have provided the richest opportunity for Al Qaeda to take root.”
• Non-committed belief: Speaker identifies the proposition as something which Speaker could believe, but Speaker happens not to have a strong belief in the proposition.
  – “Afghanistan and Pakistan may have provided the richest opportunity for Al Qaeda to take root.”
• No asserted belief: for Speaker, the proposition is not of a type in which Speaker is expressing a belief, or could express a belief. Usually this is because the proposition does not have a truth value in this world.
  – “Did Afghanistan and Pakistan provide the richest opportunity for Al Qaeda to take root?”
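As a concrete, purely illustrative encoding, the three classes above might be recorded as labels on propositions. The record layout below is a minimal sketch and an assumption, not the project's actual annotation format:

```python
from dataclasses import dataclass

# Hypothetical committed-belief record; labels follow the slide:
# "CB" = committed belief, "NCB" = non-committed belief,
# "NA" = no asserted belief.
@dataclass
class BeliefAnnotation:
    sentence: str
    target: str   # proposition head being annotated
    label: str    # "CB", "NCB", or "NA"

examples = [
    BeliefAnnotation("I know Afghanistan and Pakistan have provided the "
                     "richest opportunity for Al Qaeda to take root.",
                     "provided", "CB"),
    BeliefAnnotation("Afghanistan and Pakistan may have provided the "
                     "richest opportunity for Al Qaeda to take root.",
                     "provided", "NCB"),
    BeliefAnnotation("Did Afghanistan and Pakistan provide the richest "
                     "opportunity for Al Qaeda to take root?",
                     "provide", "NA"),
]
```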

Page 8:

Committed Belief is not Factivity

CB = committed belief, NA = no asserted belief

• Committed-belief annotation and factivity annotation are complementary.
• NA cases may lead to detection of current and future threats, sometimes conditional. Multiple modalities (opinion detection):
  – Potential: “Smith might be assassinated — if he is in power.”
  – Obligative: “Smith should be assassinated.”

         Fact                            Opinion
CB       Smith was assassinated.         Smith was a nasty dictator.
NA       Smith will be assassinated.     Smith will become a nasty dictator (once he is in power).

Page 9:

Committed Belief is not Tense

CB = committed belief, NA = no asserted belief

• A special feature indicates future tense on CB (committed belief) and NCB (non-committed belief).

         Past                                Future
CB       Smith was assassinated.             Smith will be assassinated tomorrow.
NA       I hope Smith regretted his acts.    I hope Smith will regret his acts.

Page 10:

Why Is Recognizing Committed Belief Important?

• Committed-belief annotation distinguishes:
  – Propositions that are asserted as true (CB)
  – Propositions that are asserted but speculative (NCB)
  – Propositions that are not asserted at all (NA)
• Important whenever we need to identify facts (see the sketch below):
  – IR query: show documents discussing instances of peasants being robbed of their land
    • Document found 1: “The people robbing Iraqi peasants of their land should be punished.” RELEVANT: YES
    • Document found 2: “Robbing Iraqi peasants of their land would be bad.” RELEVANT: NO
  – QA: Did the humanitarian crisis in Iraq end?
    • Text found 1: “He arrived on Tuesday, bringing an end to the humanitarian crisis in Iraq.” ANS: YES
    • Text found 2: “He arrived on Tuesday, calling for an end to the humanitarian crisis in Iraq.” ANS: I DON’T KNOW
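A minimal sketch of the filtering this enables (hypothetical function and data; it assumes candidate passages already carry belief labels as defined earlier):

```python
# Keep only passages whose supporting proposition is a committed
# belief (CB); speculative (NCB) or non-asserted (NA) propositions
# should not be returned as reported facts.
def filter_to_facts(candidates):
    """candidates: iterable of (passage, belief_label) pairs."""
    return [passage for passage, label in candidates if label == "CB"]

candidates = [
    ("He arrived on Tuesday, bringing an end to the humanitarian "
     "crisis in Iraq.", "CB"),
    ("He arrived on Tuesday, calling for an end to the humanitarian "
     "crisis in Iraq.", "NA"),
]
print(filter_to_facts(candidates))  # only the first passage remains
```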

Page 11:

Representational Layer 2: Dialog Acts

• INFORM
• REQUEST-INFORMATION
• REQUEST-ACTION
• COMMIT
• ACCEPT
• REJECT
• BACKCHANNEL
• PERFORM
• CONVENTIONAL

Page 12:

Why is dialog analysis important?

• Understanding the outcome of an interaction
  – What is the outcome?
  – Who prevailed?
  – Why (status of interactants, priority of communicative action)?
• Application of a common architecture to automatic analysis of interaction in email, blogs, phone conversations, ...
• Social Network Analysis: Is the speaker/sender in an inferior position to the hearer/receiver?
  – How can we know? (e.g., a REJECT of a REQUEST; see the sketch below)
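A minimal sketch of the REJECT-of-a-REQUEST cue (hypothetical tagged exchange and helper; it assumes each turn already carries a speaker id and a dialog-act tag from the set on the previous slide):

```python
# Each turn is (speaker, dialog_act); a toy, hand-tagged exchange.
turns = [
    ("A", "REQUEST-ACTION"),
    ("B", "REJECT"),
    ("A", "BACKCHANNEL"),
]

def rejected_requests(turns):
    """Yield (requester, rejecter) pairs where a request is immediately
    rejected -- one weak cue that the rejecter is not in an inferior
    position relative to the requester."""
    for (s1, act1), (s2, act2) in zip(turns, turns[1:]):
        if act1.startswith("REQUEST") and act2 == "REJECT":
            yield (s1, s2)

print(list(rejected_requests(turns)))  # [('A', 'B')]
```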

Page 13:

Representational Layer 3: Complex Coreference (e.g., events)

Annotate events beyond the ACE coreference definition:
– ACE does not identify events as coreferent when one mention refers only to a part of the other.
– In ACE, the plural event mention is not coreferent with mentions of the component individual events.
– ACE does not annotate:
  “Three people have been convicted… Smith and Jones were found guilty of selling guns…”
  “The gunman shot Smith and his son… The attack against Smith.”

Page 14:

Related Events (and sub-events)

• Events that happened
  – “Britain bombed Iraq last night.”
• Events which did not happen
  – “Hall did not speak about the bombings.”
• Planned events (planned, expected to happen, agreed to do…)
  – “Hall planned to meet with Saddam.”
• Sub-event examples (see the sketch below):
  – “drug war” (contains subevents: attacks, crackdowns, bullying…)
  – “attacks” (contains subevents: deaths, kidnappings, assassination, bombed…)
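A minimal sketch of how sub-event (part-of) structure could be recorded alongside ordinary coreference; the field names and nesting are assumptions for illustration, not the actual annotation scheme:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical event record: a label, whether the event actually
# happened, and any component sub-events.
@dataclass
class Event:
    label: str
    happened: bool = True
    subevents: List["Event"] = field(default_factory=list)

drug_war = Event("drug war", subevents=[
    Event("attacks", subevents=[
        Event("deaths"), Event("kidnappings"), Event("assassination"),
    ]),
    Event("crackdowns"),
    Event("bullying"),
])
```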

Page 15:

Why is complex coreference resolution important?

• Complex Question Answering:
  – Event questions: Describe the drug war events in Latin America.
  – List questions: List the events related to attacks in the drug war.
  – Relationship questions: Who is attacking whom?

Page 16:

Representational Layer 4: Temporal Relations

Baghdad 11/28 -- Senator Hall arrived in Baghdad yesterday. He told reporters that he “will not be visiting Tehran” before he left Washington. He will return next Monday.

TimeUnit    Type              Relation    Parent
11/28       Specific.Date     After       arrived
arrived     Past.Event        Before      <writer>
yesterday   Past.Date         Concurrent  arrived
told        Past.Say          Before      arrived
visiting    Neg.Future.Event  After       told
left        Past.Event        After       told
return      Future.Event      After       <writer>
Monday      Specific.Date     Concurrent  return
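The same annotations written out as data (a minimal sketch; the tuple encoding is an assumption, only the values come from the table):

```python
# (time_unit, type, relation, parent) rows from the table above.
temporal_units = [
    ("11/28",     "Specific.Date",    "After",      "arrived"),
    ("arrived",   "Past.Event",       "Before",     "<writer>"),
    ("yesterday", "Past.Date",        "Concurrent", "arrived"),
    ("told",      "Past.Say",         "Before",     "arrived"),
    ("visiting",  "Neg.Future.Event", "After",      "told"),
    ("left",      "Past.Event",       "After",      "told"),
    ("return",    "Future.Event",     "After",      "<writer>"),
    ("Monday",    "Specific.Date",    "Concurrent", "return"),
]

# The parent pointers form a tree anchored at <writer> (document
# time), which is what the "Temporal Relation Parse" on the next
# slide draws along a TIME axis.
parent = {unit: p for unit, _, _, p in temporal_units}
```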

Page 17:

Temporal Relation Parse

[Figure: temporal relation parse tree over the units 11/28, arrived, yesterday, told, (not) visiting, left, return, and Monday, anchored at <writer> along a TIME axis]

Page 18:

Temporal Relation Analysis: Inter-annotator Agreement

Temporal Type    Matches   Clashes   Agreement
410_nyt          30        1         96.8%
419_apw          28        0         100.0%
602CZ            34        3         91.9%
ENRON            12        2         85.7%
Total            104       6         94.5%

Temp Relations   Exact Match   Partial Match   Clash   Exact Agree   Part Agree
410_nyt          23            3               1       85.2%         96.3%
419_apw          24            3               0       88.9%         100.0%
602CZ            23            2               1       88.5%         96.2%
ENRON            11            1               1       84.6%         92.3%
Total            81            9               3       87.1%         96.8%

Parent Pointers  Matches   Clashes   Agreement
410_nyt          27        4         87.1%
419_apw          27        1         96.4%
602CZ            26        11        70.3%
ENRON            13        1         92.9%
Total            93        17        84.5%
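The agreement columns are consistent with matches divided by total decisions (and, for relations, exact or exact-plus-partial matches over all decisions); a quick check of a few cells, as a sketch:

```python
# Agreement = matches / (matches + clashes).
def agreement(matches, clashes):
    return matches / (matches + clashes)

print(f"{agreement(30, 1):.1%}")    # 96.8%  (Temporal Type, 410_nyt)
print(f"{agreement(104, 6):.1%}")   # 94.5%  (Temporal Type, Total)
print(f"{agreement(93, 17):.1%}")   # 84.5%  (Parent Pointers, Total)

# For relations: exact agreement and exact+partial agreement.
exact, partial, clash = 23, 3, 1            # 410_nyt row
total = exact + partial + clash
print(f"{exact / total:.1%}")               # 85.2%
print(f"{(exact + partial) / total:.1%}")   # 96.3%
```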

Page 19:

Why is Temporal Analysis Important?

• Constructing activity schedules from text

• Question answering (temporal): did/does/will X happen before/after/same_time_with Y, where X, Y are events, states, dates, or time ranges? (See the sketch below.)
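A minimal sketch of the temporal-QA idea (the coarse past/future heuristic here is purely illustrative; it assumes the type labels from the earlier annotation table and ignores the finer-grained relations a real system would need):

```python
# Unit types taken from the temporal-relation table (e.g. "arrived"
# is a Past.Event, "return" is a Future.Event relative to <writer>).
unit_type = {
    "arrived": "Past.Event",
    "told":    "Past.Say",
    "left":    "Past.Event",
    "return":  "Future.Event",
}

def happens_before(x, y):
    """True/False when decidable from past vs. future alone, else None."""
    tx, ty = unit_type[x], unit_type[y]
    if tx.startswith("Past.") and ty.startswith("Future."):
        return True
    if tx.startswith("Future.") and ty.startswith("Past."):
        return False
    return None  # same side of document time: undecidable here

print(happens_before("arrived", "return"))  # True
print(happens_before("arrived", "told"))    # None (needs more relations)
```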

Page 20:

Interoperability: Data

• Common data model

• Multiple implementations
  – based on the same underlying schema (formal object model)
  – meeting different goals / requirements
• Implementation criteria (see the sketch below):
  – Support effective run-time annotation
  – Support effective user interface, query/update
  – Support on-the-fly schema extension
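A minimal sketch of what a common data model with stand-off layers could look like (every name here is hypothetical; the slide commits only to a shared formal object model with multiple implementations):

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical stand-off model: each semantic layer (belief, dialog
# act, coreference, temporal, ...) annotates character spans of the
# same text, so a new layer can be added without touching the others
# (on-the-fly schema extension).
@dataclass
class Annotation:
    layer: str                      # e.g. "belief", "temporal"
    start: int                      # character offsets into the text
    end: int
    attributes: Dict[str, str] = field(default_factory=dict)

@dataclass
class Document:
    text: str
    annotations: List[Annotation] = field(default_factory=list)

doc = Document("Smith was assassinated.")
doc.annotations.append(Annotation("belief", 0, 23, {"label": "CB"}))
doc.annotations.append(Annotation("temporal", 10, 22, {"type": "Past.Event"}))
```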

Page 21:


Example: UMBC Modality Annotations

Page 22:

Ongoing and Future Work

• Move to a new genre: informal input.
• Establish compatibility across levels.
• Continue examining intra-site and cross-site annotation agreement rates.
• Initial assessment of the computational feasibility of machine learning approaches (“our annotations are supposed to be fodder for ML approaches”).
• Implementation of a framework for superimposing semantic “layers” on existing objects (e.g., on top of ACE types).
• Move to multiple languages.