As we will see in Chapter 11, the probability of upcoming words can be dependent on events that were arbitrarily distant and time dependent. Thus, our statistical models only give an approximation to the correct distributions and entropies of natural language.

To summarize, by making some incorrect but convenient simplifying assumptions, we can compute the entropy of some stochastic process by taking a very long sample of the output and computing its average log probability.
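As a concrete illustration of this idea, here is a minimal sketch (not from the text) that defines a small two-state Markov process, draws one long sample from it, and estimates the entropy rate as the average negative log2 probability of that sample. The state names and transition probabilities are made-up values chosen only for the example.

import math
import random

# Hypothetical two-state Markov process over the "words" 'a' and 'b';
# the transition probabilities are invented for this sketch.
TRANS = {
    'a': {'a': 0.9, 'b': 0.1},
    'b': {'a': 0.5, 'b': 0.5},
}

def sample_sequence(n, start='a'):
    """Draw a length-n sequence from the Markov chain."""
    seq, state = [start], start
    for _ in range(n - 1):
        r, cum = random.random(), 0.0
        for nxt, p in TRANS[state].items():
            cum += p
            if r < cum:
                state = nxt
                break
        seq.append(state)
    return seq

def entropy_rate_estimate(seq):
    """-(1/n) log2 P(w_1 ... w_n), ignoring the start-state term."""
    log_p = sum(math.log2(TRANS[prev][cur])
                for prev, cur in zip(seq, seq[1:]))
    return -log_p / len(seq)

sample = sample_sequence(100_000)
print(f"entropy rate estimate: {entropy_rate_estimate(sample):.4f} bits/word")

By the argument above, as the sample length grows this single-sample estimate converges to the true entropy rate of the process, so no sum over all possible sequences is needed.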
Now we are ready to introduce cross-entropy. The cross-entropy is useful when we don't know the actual probability distribution p that generated some data. It allows us to use some m, which is a model of p (i.e., an approximation to p). The cross-entropy of m on p is defined by

H(p, m) = \lim_{n \to \infty} -\frac{1}{n} \sum_{W \in L} p(w_1, \ldots, w_n) \log m(w_1, \ldots, w_n)    (4.46)

That is, we draw sequences according to the probability distribution p, but sum the log of their probabilities according to m.

Again, following the Shannon-McMillan-Breiman theorem, for a stationary ergodic process:

H(p, m) = \lim_{n \to \infty} -\frac{1}{n} \log m(w_1 w_2 \ldots w_n)    (4.47)

This means that, as for entropy, we can estimate the cross-entropy of a model m on some distribution p by taking a single sequence that is long enough instead of summing over all possible sequences.

What makes the cross-entropy useful is that the cross-entropy H(p, m) is an upper bound on the entropy H(p). For any model m:

H(p) \leq H(p, m)    (4.48)

This means that we can use some simplified model m to help estimate the true entropy of a sequence of symbols drawn according to probability p. The more accurate m is, the closer the cross-entropy H(p, m) will be to the true entropy H(p). Thus, the difference between H(p, m) and H(p) is a measure of how accurate a model is. Between two models m_1 and m_2, the more accurate model will be the one with the lower cross-entropy. (The cross-entropy can never be lower than the true entropy, so a model cannot err by underestimating the true entropy.)

We are finally ready to see the relation between perplexity and cross-entropy as we saw it in Eq. 4.47. Cross-entropy is defined in the limit, as the length of the observed word sequence goes to infinity. We will need an approximation to cross-entropy, relying on a (sufficiently long) sequence of fixed length. This approximation to the cross-entropy of a model M = P(w_i \mid w_{i-N+1} \ldots w_{i-1}) on a sequence of words W is

H(W) = -\frac{1}{N} \log P(w_1 w_2 \ldots w_N)    (4.49)

The perplexity of a model P on a sequence of words W is now formally defined as 2 raised to the power of this cross-entropy:

Perplexity(W) = 2^{H(W)}
              = P(w_1 w_2 \ldots w_N)^{-\frac{1}{N}}
              = \sqrt[N]{\frac{1}{P(w_1 w_2 \ldots w_N)}}
              = \sqrt[N]{\prod_{i=1}^{N} \frac{1}{P(w_i \mid w_1 \ldots w_{i-1})}}    (4.50)
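The per-word cross-entropy of Eq. 4.49 and the perplexity of Eq. 4.50 are easy to compute once a model has assigned a conditional probability to each word of a test sequence. The sketch below (with made-up probability values, not from the text) computes both quantities for two hypothetical models on the same sequence; as argued above, the model with the lower cross-entropy, and hence the lower perplexity, is the more accurate one.

import math

def cross_entropy(cond_probs):
    """H(W) = -(1/N) * sum_i log2 P(w_i | w_1 ... w_{i-1})   (Eq. 4.49)."""
    return -sum(math.log2(p) for p in cond_probs) / len(cond_probs)

def perplexity(cond_probs):
    """Perplexity(W) = 2**H(W) = P(w_1 ... w_N)**(-1/N)   (Eq. 4.50)."""
    return 2 ** cross_entropy(cond_probs)

# Hypothetical conditional probabilities that two models assign to the
# words of the same four-word test sequence.
model_1 = [0.20, 0.10, 0.05, 0.25]
model_2 = [0.10, 0.05, 0.02, 0.10]

for name, probs in [("model 1", model_1), ("model 2", model_2)]:
    print(f"{name}: H(W) = {cross_entropy(probs):.3f} bits/word, "
          f"perplexity = {perplexity(probs):.3f}")

Running this prints a lower cross-entropy and perplexity for model 1, which assigns higher probabilities to the observed words, matching the claim that the more accurate model has the lower cross-entropy.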
4.8 Summary

This chapter introduced language modeling and the N-gram, one of the most widely used tools in language processing.