Lecture12-lmdiscount

CS 479, Section 1: Natural Language Processing
Lecture #12: Language Model Smoothing: Discounting

Thanks to Dan Klein of UC Berkeley for many of the materials used in this lecture. This work is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License.

Announcements
Reading Report #5: due now.
Project #1, Part 1: early deadline Wednesday; due Friday.
Questions?

Objectives
Get comfortable with the process of factoring and smoothing a joint model of a familiar object: text!
Look closely at discounting techniques for smoothing language models.

Review: Language Models
Is an LM (i.e., a Markov model) a joint or conditional distribution? Over what?
Is a local model a joint or conditional distribution? Over what?
How are the LM and the local model related? The LM is a joint distribution over word sequences, and it is factored into the local model's conditional distributions:

P(\text{sentence}) = P(w_1 \ldots w_n) = \prod_{i=1}^{n} P(w_i \mid w_{i-1} \ldots w_{i-m})

where each factor P(w_i \mid w_{i-1} \ldots w_{i-m}) is the local model, conditioned on the preceding m words.

Training
To train a language model in our framework, how should you go about it?
Hand the sentences to the constructor for BasicMarkovModel.
Extract the n-grams.
The BasicMarkovModel should in turn hand the n-gram data to the constructor for the local model learner.
If the local model is an interpolated model, hand the data to the learners for its sub-models. (A minimal sketch of this pipeline appears below.)
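The following is a minimal Python sketch of the pipeline just described, under assumptions not spelled out in the slides: the class names (BasicMarkovModel, AbsoluteDiscountingBigramModel), the bigram setting (m = 1), and the fixed discount of 0.75 are illustrative choices, not the course framework's actual API. The local model uses absolute discounting, one of the discounting techniques this lecture examines: a fixed discount is subtracted from every observed bigram count, and the reserved probability mass is redistributed according to a unigram distribution.

from collections import defaultdict

class AbsoluteDiscountingBigramModel:
    """Local model P(w | prev) with absolute discounting over bigram counts."""

    def __init__(self, bigram_counts, unigram_counts, discount=0.75):
        self.bigram_counts = bigram_counts  # dict: prev -> {word: count}
        self.context_totals = {p: sum(c.values()) for p, c in bigram_counts.items()}
        total = sum(unigram_counts.values())
        self.unigram_probs = {w: c / total for w, c in unigram_counts.items()}
        self.discount = discount

    def prob(self, word, prev):
        counts = self.bigram_counts.get(prev, {})
        total = self.context_totals.get(prev, 0)
        if total == 0:
            # Unseen context: fall back to the unigram distribution.
            return self.unigram_probs.get(word, 0.0)
        # Subtract a fixed discount from each observed bigram count ...
        discounted = max(counts.get(word, 0) - self.discount, 0.0) / total
        # ... and redistribute the reserved mass via the unigram distribution.
        reserved = self.discount * len(counts) / total
        return discounted + reserved * self.unigram_probs.get(word, 0.0)

class BasicMarkovModel:
    """Joint model over sentences, factored into local conditional models."""

    START, STOP = "<s>", "</s>"

    def __init__(self, sentences, discount=0.75):
        bigram_counts = defaultdict(lambda: defaultdict(int))
        unigram_counts = defaultdict(int)
        # Extract the n-grams (here, bigrams) from the training sentences.
        for sentence in sentences:
            words = [self.START] + list(sentence) + [self.STOP]
            for prev, word in zip(words, words[1:]):
                bigram_counts[prev][word] += 1
                unigram_counts[word] += 1
        # Hand the n-gram data to the local model learner.
        self.local = AbsoluteDiscountingBigramModel(
            {p: dict(c) for p, c in bigram_counts.items()},
            dict(unigram_counts),
            discount,
        )

    def prob(self, sentence):
        # P(sentence) = product over i of the local P(w_i | w_{i-1}).
        words = [self.START] + list(sentence) + [self.STOP]
        p = 1.0
        for prev, word in zip(words, words[1:]):
            p *= self.local.prob(word, prev)
        return p

For example, BasicMarkovModel([["the", "dog", "barks"], ["the", "cat", "sleeps"]]).prob(["the", "dog", "sleeps"]) multiplies the local conditional probabilities exactly as in the product formula above; the discounting in the local model keeps the unseen bigram "dog sleeps" from driving the whole product to zero.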