lect35-future - Current & Future NLP Research: A Few Random Remarks

600.465 - Intro to NLP - J. Eisner

Slide 1: Current & Future NLP Research - A Few Random Remarks
Slide 2: Computational Linguistics

We can study anything about language ...
1. Formalize some insights
2. Study the formalism mathematically
4. Test on real data
Slide 3: Reprise from Lecture 1: What's hard about this story?

These ambiguities now look familiar. You now know how to solve some (e.g., with conditional log-linear models):
- PP attachment
- Coreference resolution (which NP does "it" refer to?)
- Word sense disambiguation (hardest part: how many senses? what are they?)

Others still seem beyond the state of the art (except in limited settings):
- Anything that requires much semantics or reasoning
- Quantifier scope
- Reasoning about John's beliefs and actions
- "Deep" meaning of words and relations

The story: John stopped at the donut store on his way home from work. He thought a coffee was good every few hours. But it turned out to be too expensive there.
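To make the mention of conditional log-linear models concrete, here is a minimal sketch of one applied to PP attachment: the model scores each attachment decision by summing feature weights and normalizes with a softmax. The feature names and weight values are hypothetical toy choices for illustration, not from the lecture.

```python
import math

# Hypothetical toy weights for (attachment-site, feature) pairs.
WEIGHTS = {
    ("noun", "prep=with"): 1.2,   # e.g., "ate pizza with anchovies" -> noun attachment
    ("verb", "prep=with"): 0.3,
    ("verb", "prep=on"):   1.0,   # e.g., "put the book on the table" -> verb attachment
    ("noun", "prep=on"):   0.1,
}

def score(attach, features):
    """Sum the weights of all features that fire for this attachment decision."""
    return sum(WEIGHTS.get((attach, f), 0.0) for f in features)

def p_attach(features, labels=("noun", "verb")):
    """Conditional distribution over attachment sites: softmax of the scores."""
    exps = {a: math.exp(score(a, features)) for a in labels}
    z = sum(exps.values())
    return {a: e / z for a, e in exps.items()}

# With the toy weights, "with" favors noun attachment, "on" favors verb attachment.
print(p_attach(["prep=with"]))
print(p_attach(["prep=on"]))
```

In a real system the weights would be learned from annotated data (e.g., by maximizing conditional log-likelihood), and the feature set would be much richer.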
Slide 4: Deep NLP Requires World Knowledge

- The pen is in the box. / The box is in the pen.
- The police watched the demonstrators because they feared violence. / The police watched the demonstrators because they advocated violence.
- Mary and Sue are sisters. / Mary and Sue are mothers.
- Every American has a mother. / Every American has a president.
- John saw his brother skiing on TV. The fool ... didn't have a coat on! / ... didn't recognize him!
- George Burns: My aunt is in the hospital. I went to see her today, and took her flowers.
  Gracie Allen: George, that's terrible!

(Examples mostly from Terry Winograd in the 1970s, via Doug Lenat.)
Slide 5: Big Questions of CL

What formalisms can encode various kinds of linguistic knowledge?
- Discrete knowledge: what is possible?
- Continuous knowledge: what is likely?
  - What kind of p(...) to use (e.g., a PCFG)?
  - What is the prior over the structure (set of rules) and parameters (rule weights)?
- How to combine different kinds of knowledge, including world knowledge?

How can we compute efficiently within these formalisms? Or find approximations that work pretty well?
- Problem 1: Prediction in a given model.
- Problem 2: Learning the model.

How should we learn within a given formalism?
- Hard with unsupervised, semi-supervised, heterogeneous data ...
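The "continuous knowledge: what is likely" point can be illustrated with the PCFG mentioned above: each rule carries a probability, and a tree's probability is the product of the probabilities of the rules used in its derivation. The grammar and tree below are a toy example of my own, not from the lecture.

```python
# Toy PCFG: maps (parent, tuple-of-children) to the rule's probability.
RULES = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("John",)):   0.5,
    ("NP", ("pizza",)):  0.5,
    ("VP", ("V", "NP")): 1.0,
    ("V",  ("ate",)):    1.0,
}

def tree_prob(tree):
    """Probability of a derivation: the product of its rule probabilities.
    A tree is (label, children); terminals are plain strings."""
    label, children = tree
    child_labels = tuple(c if isinstance(c, str) else c[0] for c in children)
    p = RULES[(label, child_labels)]
    for c in children:
        if not isinstance(c, str):
            p *= tree_prob(c)
    return p

tree = ("S", [("NP", ["John"]), ("VP", [("V", ["ate"]), ("NP", ["pizza"])])])
print(tree_prob(tree))  # 1.0 * 0.5 * 1.0 * 1.0 * 0.5 = 0.25
```

Prediction in this model (Problem 1) means finding the highest-probability tree for a sentence; learning (Problem 2) means estimating the rule probabilities, which is easy from treebanked data and hard from unsupervised data.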

