
Lecture 25: More NLP and IE


Page 1: Lecture 25: More NLP and IE

2007.05.01 - SLIDE 1 - IS 240 – Spring 2007

Prof. Ray Larson University of California, Berkeley

School of Information
Tuesday and Thursday 10:30 am - 12:00 pm

Spring 2007
http://courses.ischool.berkeley.edu/i240/s07

Principles of Information Retrieval

Lecture 25: More NLP and IE

Page 2: Lecture 25: More NLP and IE


Today

• Review
  – NLP for IR
  – Text Summarization

• Cross-Language Information Retrieval
  – Introduction
  – Cross-Language EVIs

Credit for some of the material in this lecture goes to Doug Oard (University of Maryland) and to Fredric Gey and Aitao Chen

Page 3: Lecture 25: More NLP and IE


Today

• Review
  – NLP for IR

• More on NLP and Information Extraction – From Christopher Manning (Stanford)

“Opportunities in Natural Language Processing”

Page 4: Lecture 25: More NLP and IE


Natural Language Processing and IR

• The main approach in applying NLP to IR has been to attempt to address
  – Phrase usage vs. individual terms
  – Search expansion using related terms/concepts
  – Attempts to automatically exploit or assign controlled vocabularies

Page 5: Lecture 25: More NLP and IE


NLP and IR

• Much early research showed that (at least in the restricted test databases tested)
  – Indexing documents by individual terms corresponding to words and word stems produces retrieval results at least as good as when indexes use controlled vocabularies (whether applied manually or automatically)
  – Constructing phrases or “pre-coordinated” terms provides only marginal and inconsistent improvements

Page 6: Lecture 25: More NLP and IE


NLP and IR

• Not clear why intuitively plausible improvements to document representation have had little effect on retrieval results when compared to statistical methods
  – E.g., use of syntactic role relations between terms has shown no improvement in performance over “bag of words” approaches

Page 7: Lecture 25: More NLP and IE


General Framework of NLP

Morphological and Lexical Processing
    ↓
Syntactic Analysis
    ↓
Semantic Analysis
    ↓
Context Processing / Interpretation

Worked example, “John runs.”:
• Morphological/lexical processing: John run+s (P-N; V, 3-pres)
• Syntactic analysis: parse tree S → NP (P-N: John), VP (V: run)
• Semantic analysis: predicate-argument structure Pred: RUN, Agent: John
• Context processing/interpretation: “John is a student. He runs.”

Slide from Prof. J. Tsujii, Univ. of Tokyo and Univ. of Manchester
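The four stages above can be sketched end-to-end for the slide’s example sentence. Everything here (the toy lexicon, the tuple shapes, the function names) is an illustrative assumption, not an implementation from the lecture:

```python
# Toy sketch of the four NLP stages for "John runs." — hand-coded
# lexicon and rules, for illustration only.

LEXICON = {
    "John": ("P-N", "John"),   # proper noun, lemma "John"
    "runs": ("V", "run"),      # 3rd-person singular present of "run"
}

def morphological_analysis(sentence):
    """Tokenize and look up part of speech and lemma for each token."""
    tokens = sentence.rstrip(".").split()
    return [(tok, *LEXICON[tok]) for tok in tokens]

def syntactic_analysis(analyzed):
    """Build a flat S -> NP VP structure for a two-word sentence."""
    (subj, _, _), (_, _, lemma) = analyzed
    return ("S", ("NP", ("P-N", subj)), ("VP", ("V", lemma)))

def semantic_analysis(tree):
    """Map the parse tree to a predicate-argument structure."""
    subj = tree[1][1][1]   # the proper noun under NP
    pred = tree[2][1][1]   # the verb lemma under VP
    return {"Pred": pred.upper(), "Agent": subj}

analyzed = morphological_analysis("John runs.")
tree = syntactic_analysis(analyzed)
sem = semantic_analysis(tree)
print(sem)   # {'Pred': 'RUN', 'Agent': 'John'}
```

The context-processing stage (resolving “He” across sentences) is omitted; it needs discourse state that a sketch this small cannot carry.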

Page 8: Lecture 25: More NLP and IE


Using NLP

• Strzalkowski (in Reader)

Text → [NLP: TAGGER → PARSER → TERMS] → NLP representation → database search

Page 9: Lecture 25: More NLP and IE


Using NLP

INPUT SENTENCE
The former Soviet President has been a local hero ever since a Russian tank invaded Wisconsin.

TAGGED SENTENCE
The/dt former/jj Soviet/jj President/nn has/vbz been/vbn a/dt local/jj hero/nn ever/rb since/in a/dt Russian/jj tank/nn invaded/vbd Wisconsin/np ./per
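A tagging step like the one shown can be sketched with a hand-built lookup table. Real taggers are trained statistically on corpora; the `TAGS` dictionary below is an assumption built for this one sentence only:

```python
# Toy dictionary-based tagger reproducing the slide's Penn-style tags.
# The lookup table is hand-built for this sentence; real taggers learn
# tag probabilities from annotated corpora.

TAGS = {
    "The": "dt", "former": "jj", "Soviet": "jj", "President": "nn",
    "has": "vbz", "been": "vbn", "a": "dt", "local": "jj", "hero": "nn",
    "ever": "rb", "since": "in", "Russian": "jj", "tank": "nn",
    "invaded": "vbd", "Wisconsin": "np", ".": "per",
}

def tag(sentence):
    # Split off the final period so it becomes its own token, then
    # attach each token's tag with a slash.
    tokens = sentence.replace(".", " .").split()
    return " ".join(f"{tok}/{TAGS[tok]}" for tok in tokens)

print(tag("The former Soviet President has been a local hero "
          "ever since a Russian tank invaded Wisconsin."))
```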

Page 10: Lecture 25: More NLP and IE


Using NLP

TAGGED & STEMMED SENTENCE
the/dt former/jj soviet/jj president/nn have/vbz be/vbn a/dt local/jj hero/nn ever/rb since/in a/dt russian/jj tank/nn invade/vbd wisconsin/np ./per
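The stemming step lowercases tokens and reduces inflected forms to their base forms. A minimal sketch, with a hand-built irregular-form table (an assumption for illustration, not the morphological dictionary the system actually used):

```python
# Toy stemmer for tagged text: lowercase each word and map irregular
# inflected forms to base forms, leaving the tags untouched.

IRREGULAR = {"has": "have", "been": "be", "invaded": "invade"}

def stem(tagged):
    out = []
    for pair in tagged.split():
        word, pos = pair.rsplit("/", 1)   # split word from its tag
        word = word.lower()
        out.append(f"{IRREGULAR.get(word, word)}/{pos}")
    return " ".join(out)

print(stem("The/dt former/jj Soviet/jj President/nn has/vbz been/vbn"))
# the/dt former/jj soviet/jj president/nn have/vbz be/vbn
```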

Page 11: Lecture 25: More NLP and IE


Using NLP

PARSED SENTENCE

[assert

[[perf [have]][[verb[BE]]

[subject [np[n PRESIDENT][t_pos THE]

[adj[FORMER]][adj[SOVIET]]]]

[adv EVER]

[sub_ord[SINCE [[verb[INVADE]]

[subject [np [n TANK][t_pos A]

[adj [RUSSIAN]]]]

[object [np [name [WISCONSIN]]]]]]]]]

Page 12: Lecture 25: More NLP and IE


Using NLP

EXTRACTED TERMS & WEIGHTS

president          2.623519    soviet            5.416102
president+soviet  11.556747    president+former 14.594883
hero               7.896426    hero+local       14.314775
invade             8.435012    tank              6.848128
tank+invade       17.402237    tank+russian     16.030809
russian            7.383342    wisconsin         7.785689
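The pattern in these weights — compound terms such as tank+invade scoring much higher than their component words — follows from rarity: a head-modifier pair occurs in far fewer documents than either word alone. A toy idf-style sketch of that effect (the collection size and document frequencies below are invented for illustration and do not reproduce the slide’s numbers):

```python
# Toy idf-style weighting showing why compound (head+modifier) terms
# outweigh single terms: they are rarer in the collection.
import math

N_DOCS = 100_000            # hypothetical collection size
DF = {                      # hypothetical document frequencies
    "president": 12_000, "soviet": 700, "president+soviet": 5,
    "tank": 1_500, "invade": 900, "tank+invade": 3,
}

def idf(term):
    """Inverse document frequency: rarer terms get larger weights."""
    return math.log2(N_DOCS / DF[term])

for head, mod in [("president", "soviet"), ("tank", "invade")]:
    compound = f"{head}+{mod}"
    print(f"{head}: {idf(head):.2f}  {mod}: {idf(mod):.2f}  "
          f"{compound}: {idf(compound):.2f}")
```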

Page 13: Lecture 25: More NLP and IE


NLP & IR

• Indexing
  – Use of NLP methods to identify phrases
    • Test weighting schemes for phrases
  – Use of more sophisticated morphological analysis
• Searching
  – Use of two-stage retrieval
    • Statistical retrieval
    • Followed by more sophisticated NLP filtering
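The two-stage design above can be sketched as a cheap bag-of-words ranker over the whole collection, followed by a more selective filter applied only to the top candidates. Both scoring functions below are simple stand-ins (word overlap and contiguous-phrase matching), chosen only to keep the sketch self-contained:

```python
# Toy two-stage retrieval: statistical ranking first, then an "NLP"
# filter over only the top-k candidates. Real systems would use a
# tf-idf/probabilistic ranker and syntactic matching instead.

def stage1_statistical(query, docs, k=3):
    """Rank documents by bag-of-words overlap; return the top k."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q & set(d.lower().split())))
    return scored[:k]

def stage2_nlp_filter(query, candidates):
    """Keep candidates containing the query as a contiguous phrase —
    a cheap stand-in for real syntactic filtering."""
    return [d for d in candidates if query.lower() in d.lower()]

docs = [
    "a russian tank invaded wisconsin",
    "the tank was russian",
    "wisconsin cheese production rose",
]
top = stage1_statistical("russian tank", docs)
final = stage2_nlp_filter("russian tank", top)
print(final)   # ['a russian tank invaded wisconsin']
```

The point of the split is cost: the expensive second stage only ever sees k documents, however large the collection is.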

Page 14: Lecture 25: More NLP and IE


NLP & IR

• New “Question Answering” track at TREC has been exploring these areas
  – Usually statistical methods are used to retrieve candidate documents
  – NLP techniques are used to extract the likely answers from the text of the documents

Page 15: Lecture 25: More NLP and IE


Mark’s idle speculation

• What people think is always going on

(Diagram contrasting the roles of “Keywords” and “NLP”)
From Mark Sanderson, University of Sheffield

Page 16: Lecture 25: More NLP and IE


Mark’s idle speculation

• What’s usually actually going on

(Diagram contrasting the roles of “Keywords” and “NLP”)
From Mark Sanderson, University of Sheffield

Page 17: Lecture 25: More NLP and IE


Additional Slides on NLP and IE
• From Christopher Manning, Stanford