
Page 1:

Dependency Parsing
Hung-yi Lee 李宏毅
National Taiwan University (國立臺灣大學)

Page 2:

Where parsing fits among NLP tasks (input: one sequence or multiple sequences; grouped by output type):

• One Class: Sentiment Classification, Stance Detection, Veracity Prediction, Intent Classification, Dialogue Policy, NLI, Search Engine, Relation Extraction
• Class for each Token: POS Tagging, Word Segmentation, Extractive Summarization, Slot Filling, NER
• Copy from Input: Extractive QA
• General Sequence: Abstractive Summarization, Translation, Grammar Correction, NLG, General QA, Chatbot, State Tracker, Task-Oriented Dialogue
• Other: Parsing, Coreference Resolution

Page 3:

Constituency Parsing: given "deep learning is very powerful", decide which spans are constituents (e.g. "deep learning") and which are not.

Dependency Parsing: given "We book her the flight to Taipei", draw directed arcs from each head word to its dependents.

Page 4:

Page 5:

Dependency Parsing

A dependency parse is a directed graph:
• node → word
• edge → relation

Example: "I want to study a PhD"
ROOT → want; nsubj: want → I; xcomp: want → study; mark: study → to; obj: study → PhD; det: PhD → a

• All the words have one incoming edge, except ROOT.
• There is a unique path from each word to ROOT.

These two properties make the graph a tree.
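As a quick sketch (not from the slides), the parse above can be stored as a list of head indices, and the two tree properties checked directly; the heads list and the is_tree helper are illustrative names.

```python
# Encode "I want to study a PhD": index 0 is ROOT, heads[i] is the
# head of word i, so every word has exactly one incoming edge.
words = ["ROOT", "I", "want", "to", "study", "a", "PhD"]
heads = [None, 2, 0, 4, 2, 6, 4]  # e.g. heads[1] = 2: want -> I (nsubj)

def is_tree(heads):
    """Check the second property: every word reaches ROOT (index 0)."""
    for i in range(1, len(heads)):
        seen, j = set(), i
        while j != 0:          # follow incoming edges toward ROOT
            if j in seen:      # revisiting a node means a cycle
                return False
            seen.add(j)
            j = heads[j]
    return True

print(is_tree(heads))  # True: the graph is a valid dependency tree
```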

Page 6:

Constituency Parsing vs. Dependency Parsing

Constituency Parsing: given a span …… w2 w3 w4 w5 ……, a classifier answers
• Constituent? → binary classification
• Which label? → multi-class classification

Dependency Parsing: given two words …… wl …… wr ……, a classifier answers
• wl → wr ? → binary classification
• Which relation? → multi-class classification

Page 7:

Graph-based

ROOT + "I want to study a PhD". For every candidate pair wl → wr, run a classifier that answers YES or NO. With N words (plus ROOT), the classifier runs at most (N+1)² times.
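A brute-force sketch of the counting argument; the arc_score function is a stand-in for the neural classifier, not an actual model. Every ordered (head, dependent) pair gets scored, so with N words plus ROOT there are at most (N+1)² classifier calls.

```python
import itertools

sentence = ["I", "want", "to", "study", "a", "PhD"]
nodes = ["ROOT"] + sentence   # N = 6 words, plus ROOT

def arc_score(head, dep):
    """Stand-in for the classifier: probability of head -> dep."""
    return 0.5                # a real model would look at the words

calls, scores = 0, {}
for h, d in itertools.product(range(len(nodes)), repeat=2):
    if d == 0 or h == d:      # ROOT takes no head; no self-arcs
        continue
    scores[(h, d)] = arc_score(nodes[h], nodes[d])
    calls += 1

print(calls)                  # 36, within the (N+1)**2 = 49 bound
```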

Page 8:

Graph-based [Dozat, et al., ICLR’17] [Dozat, et al., ACL’18]
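A rough numpy sketch of the deep biaffine arc scorer of [Dozat, et al., ICLR’17]; the dimensions, random inputs, and variable names are illustrative, not the paper's. Each word gets a head representation and a dependent representation, and a bilinear term plus a head-bias term scores all arcs at once.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 7, 4                       # 6 words + ROOT, toy hidden size

H_head = rng.normal(size=(n, d))  # per-word "as head" vectors
H_dep  = rng.normal(size=(n, d))  # per-word "as dependent" vectors
U = rng.normal(size=(d, d))       # bilinear weight
b = rng.normal(size=(d,))         # bias scoring "head-ness" alone

# S[h, m] = H_head[h] @ U @ H_dep[m] + H_head[h] @ b
S = H_head @ U @ H_dep.T + (H_head @ b)[:, None]

# Greedy decoding: each word picks its best head. This can yield a
# non-tree, which is why MST decoding (next slides) is used instead.
print(S[:, 1:].argmax(axis=0))    # best head index for words 1..6
```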

Page 9:

Graph-based

Each pair is classified independently, so local decisions can conflict: the classifier may answer YES for both w1 → w3 and w2 → w3, giving w3 two incoming edges (two heads). Contradiction!

Page 10:

Maximum Spanning Tree

Treat the classifier scores as edge weights on the directed graph over ROOT, w1, w2:
ROOT → w1: 0.9, ROOT → w2: 0.2, w1 → w2: 0.7, w2 → w1: 0.3

Candidate trees:
• ROOT → w2, w2 → w1: total 0.2 + 0.3 = 0.5
• ROOT → w1, w1 → w2: total 0.9 + 0.7 = 1.6 ← the maximum spanning tree
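A brute-force sketch of MST decoding on the toy graph above; real parsers use the Chu-Liu/Edmonds algorithm, and enumeration like this is only feasible for tiny sentences. Names such as reaches_root are illustrative.

```python
import itertools

# Edge scores from the slide: (head, dependent) -> score.
score = {("ROOT", "w1"): 0.9, ("ROOT", "w2"): 0.2,
         ("w1", "w2"): 0.7, ("w2", "w1"): 0.3}
words = ["w1", "w2"]

def reaches_root(heads):
    """A head assignment is a tree iff every word reaches ROOT."""
    for w in words:
        seen, cur = set(), w
        while cur != "ROOT":
            if cur in seen:       # cycle, e.g. w1 -> w2 -> w1
                return False
            seen.add(cur)
            cur = heads[cur]
    return True

best, best_score = None, float("-inf")
for hs in itertools.product(["ROOT"] + words, repeat=len(words)):
    heads = dict(zip(words, hs))
    if any(h == w for w, h in heads.items()) or not reaches_root(heads):
        continue                  # skip self-arcs and cyclic assignments
    total = sum(score[(h, w)] for w, h in heads.items())
    if total > best_score:
        best, best_score = heads, total

print(best, best_score)  # {'w1': 'ROOT', 'w2': 'w1'} 1.6
```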

Page 11:

Transition-based Approach

A stack, a buffer, some actions ……

We saw a similar approach when discussing constituency parsing.
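One concrete instance is the arc-standard transition system; the slides do not fix a particular one, so this sketch, driven by a hand-written action sequence instead of a learned classifier, is only illustrative.

```python
def parse(words, actions):
    """Arc-standard: SHIFT moves a word onto the stack; LEFT-ARC and
    RIGHT-ARC attach the two topmost stack items and pop the dependent."""
    stack, buffer, arcs = ["ROOT"], list(words), []
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))
        elif act == "LEFT":                 # top of stack is the head
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif act == "RIGHT":                # second item is the head
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# Gold action sequence for "I want to study a PhD" (hand-derived here;
# a transition-based parser predicts each action with a classifier).
actions = ["SHIFT", "SHIFT", "LEFT",        # want -> I (nsubj)
           "SHIFT", "SHIFT", "LEFT",        # study -> to (mark)
           "SHIFT", "SHIFT", "LEFT",        # PhD -> a (det)
           "RIGHT",                         # study -> PhD (obj)
           "RIGHT",                         # want -> study (xcomp)
           "RIGHT"]                         # ROOT -> want
print(parse("I want to study a PhD".split(), actions))
```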

Page 12:

Transition-based Approach

[Dyer, et al., ACL’15]

[Chen, et al., EMNLP’14]

Page 13:

SyntaxNet [Andor, et al., ACL’16]

https://ai.googleblog.com/2016/05/announcing-syntaxnet-worlds-most.html

Page 14:

Stack Pointer

[Ma, et al., ACL’18]

Page 15:

Reference

• Danqi Chen, Christopher D. Manning, A Fast and Accurate Dependency Parser using Neural Networks, EMNLP, 2014

• Chris Dyer, Miguel Ballesteros, Wang Ling, Austin Matthews, Noah A. Smith, Transition-Based Dependency Parsing with Stack Long Short-Term Memory, ACL, 2015

• Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov and Michael Collins, Globally Normalized Transition-Based Neural Networks, ACL, 2016

• Timothy Dozat, Christopher D. Manning, Deep Biaffine Attention for Neural Dependency Parsing, ICLR, 2017

• Timothy Dozat, Christopher D. Manning, Simpler but More Accurate Semantic Dependency Parsing, ACL, 2018

• Xuezhe Ma, Zecong Hu, Jingzhou Liu, Nanyun Peng, Graham Neubig, Eduard Hovy, Stack-Pointer Networks for Dependency Parsing, ACL, 2018