Chapter 3: N-Grams

Search and Decoding in Speech Recognition: N-Grams
Veton Këpuska, ECE 5527 (Fall '11), FIT
N-Grams

- The problem of word prediction. Example: "I'd like to make a collect ..."
  - Very likely next words: "call", "international call", or "phone call", and NOT "the".
- The idea of word prediction is formalized with probabilistic models called N-grams.
  - N-grams predict the next word from the previous N-1 words.
  - Statistical models of word sequences are also called language models, or LMs.
- Computing the probability of the next word turns out to be closely related to computing the probability of an entire sequence of words.
- Example: "... all of a sudden I notice three guys standing on the sidewalk ..." is a far more probable English sequence than "... on guys all I of notice sidewalk three a sudden standing the ...", and a language model should reflect that. (A toy next-word predictor is sketched below.)
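The following is a minimal sketch of bigram-based next-word prediction. The toy corpus, its sentences, and the function name predict_next are illustrative assumptions, not part of the lecture; a real model would be estimated from a large corpus.

```python
from collections import Counter, defaultdict

# Toy training corpus -- illustrative sentences only, not from the lecture.
corpus = [
    "i'd like to make a collect call".split(),
    "i'd like to make a phone call".split(),
    "i'd like to make an international call".split(),
]

# Count bigrams: how often each word w2 follows each word w1.
bigram_counts = defaultdict(Counter)
for sentence in corpus:
    for w1, w2 in zip(sentence, sentence[1:]):
        bigram_counts[w1][w2] += 1

def predict_next(word, k=3):
    """Return the k most likely words to follow `word` (MLE estimate)."""
    counts = bigram_counts[word]
    total = sum(counts.values())
    return [(w, c / total) for w, c in counts.most_common(k)]

print(predict_next("a"))  # [('collect', 0.5), ('phone', 0.5)] -- "the" never follows "a"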
N-Grams

- Estimators like N-grams, which assign a conditional probability to possible next words, can be used to assign a joint probability to an entire sentence.
- N-gram models are one of the most important tools in speech and language processing.
- N-grams are essential in any task in which words must be identified from ambiguous, noisy input.
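As a concrete illustration: by the chain rule, P(w1 ... wn) = P(w1) P(w2|w1) ... P(wn|w1...wn-1), and a bigram model approximates each factor by P(wk | wk-1). The sketch below (continuing the bigram_counts table from the previous sketch) scores a whole sentence this way; note that without smoothing, any unseen bigram drives the probability to zero.

```python
import math

def sentence_logprob(sentence):
    """Bigram approximation of the chain rule:
    log P(w1..wn) ~ sum over k of log P(w_k | w_{k-1})."""
    logp = 0.0
    for w1, w2 in zip(sentence, sentence[1:]):
        counts = bigram_counts[w1]   # from the previous sketch
        total = sum(counts.values())
        if counts[w2] == 0:
            return float("-inf")     # unseen bigram: zero probability (no smoothing)
        logp += math.log(counts[w2] / total)
    return logp

# A fluent word order scores higher than a scrambled one.
print(sentence_logprob("i'd like to make a collect call".split()))  # finite
print(sentence_logprob("collect a make to like i'd call".split()))  # -inf
```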
N-Gram Application Areas

- Speech recognition: the input speech sounds are highly confusable, and many words sound extremely similar.
- Handwriting and optical character recognition (OCR): probabilities of word sequences help in recognition.
  - In his movie "Take the Money and Run", Woody Allen tries to rob a bank with a sloppily written hold-up note that the teller incorrectly reads as "I have a gub".
  - A speech and language processing system can avoid this mistake by using the knowledge that the sequence "I have a gun" is far more probable than the non-word sequence "I have a gub", or even "I have a gull". (A reranking sketch follows.)
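Here is a hedged sketch of how a language model could rerank noisy recognition hypotheses like the ones above. The tiny corpus, the candidate list, and the add-one (Laplace) smoothing are all illustrative assumptions; a deployed recognizer would use a far larger corpus and better smoothing.

```python
import math
from collections import Counter, defaultdict

# Tiny illustrative training text; a real language model would be trained
# on a large corpus.
corpus = [
    "i have a gun".split(),
    "he has a gun".split(),
    "she saw a gull".split(),
]
vocab = {w for s in corpus for w in s} | {"gub"}  # include the misread word

bigrams = defaultdict(Counter)
for s in corpus:
    for w1, w2 in zip(s, s[1:]):
        bigrams[w1][w2] += 1

def logprob(sentence):
    """Bigram log-probability with add-one (Laplace) smoothing, so unseen
    bigrams get a small nonzero probability instead of zero."""
    lp = 0.0
    for w1, w2 in zip(sentence, sentence[1:]):
        c = bigrams[w1]
        lp += math.log((c[w2] + 1) / (sum(c.values()) + len(vocab)))
    return lp

for cand in ["i have a gun", "i have a gub", "i have a gull"]:
    print(f"{cand}: {logprob(cand.split()):.2f}")
# "i have a gun" scores highest: "a gun" occurs in training, "a gub" never does.
```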
N-Gram Application Areas

- Statistical machine translation: choosing among a set of potential rough English translations of a Chinese source sentence, e.g.:
  - he briefed to reporters on the chief contents of the statement
  - he briefed reporters on the chief contents of the statement
  - he briefed to reporters on the main contents of the statement
  - he briefed reporters on the main contents of the statement
- An N-gram grammar might tell us that "briefed reporters" is more likely than "briefed to reporters", and that "main contents" is more likely than "chief contents", so the last candidate is the most fluent translation.
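To make the reranking concrete, the sketch below scores the four candidates with a bigram model. The probability table and the back-off constant are hypothetical placeholder numbers chosen only to mirror the claim above; real values would be estimated from a large English corpus.

```python
import math

# Hypothetical bigram probabilities P(w2 | w1) -- placeholder numbers,
# not real corpus estimates.
P = {
    ("briefed", "reporters"): 0.30,
    ("briefed", "to"): 0.02,
    ("to", "reporters"): 0.10,
    ("reporters", "on"): 0.25,
    ("main", "contents"): 0.20,
    ("chief", "contents"): 0.05,
}
DEFAULT = 0.10  # crude back-off value for bigrams not listed above

def score(sentence):
    """Sum of log bigram probabilities; higher means more fluent."""
    words = sentence.split()
    return sum(math.log(P.get((w1, w2), DEFAULT))
               for w1, w2 in zip(words, words[1:]))

candidates = [
    "he briefed to reporters on the chief contents of the statement",
    "he briefed reporters on the chief contents of the statement",
    "he briefed to reporters on the main contents of the statement",
    "he briefed reporters on the main contents of the statement",
]
print(max(candidates, key=score))
# -> "he briefed reporters on the main contents of the statement"
```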
