Ch6-Formal Grammars of English

Search and Decoding in Speech Recognition
Veton Kpuska, February 11, 2012

Formal Grammars of English

The oldest known grammar was written over 2,000 years ago by Panini for the Sanskrit language. Geoff Pullum noted in a recent talk that "almost everything most educated Americans believe about English grammar is wrong".

Syntax
The word syntax comes from the Greek syntaxis, meaning "setting out together or arrangement"; it refers to the way the words of a sentence are arranged together. Syntactic notions discussed previously include regular languages and the computation of probabilities over those regular-language representations. This chapter introduces more sophisticated notions of syntax and grammar that go beyond these simpler ones:
Constituency
Grammatical relations
Subcategorization and dependency

Constituency
Groups of words may behave as a single unit or phrase, called a constituent. For example, the noun phrase often acts as a unit; it can be a single word (she, Michael) or a phrase (the house, Russian Hill, a well-weathered three-story structure). Context-free grammars are a formalism that will allow us to model these constituency facts.

Grammatical Relations
Grammatical relations are a formalization of ideas from traditional grammar such as SUBJECTS and OBJECTS, and other related notions. In the following sentence the noun phrase She is the SUBJECT and a mammoth breakfast is the OBJECT:
She ate a mammoth breakfast.

Subcategorization and Dependency Relations
Subcategorization and dependency relations refer to certain kinds of relations between words and phrases. For example, the verb want can be followed by an infinitive, as in I want to fly to Detroit, or a noun phrase, as in I want a flight to Detroit. But the verb find cannot be followed by an infinitive: *I found to fly to Dallas. These are called facts about the subcategorization of the verb.

Context-Free Grammars
As we'll see, none of the syntactic mechanisms that we've discussed up until now can easily capture such phenomena. They can be modeled much more naturally by grammars that are based on context-free grammars. Context-free grammars are thus the backbone of many formal models of the syntax of natural language (and, for that matter, of computer languages). As such they are integral to many computational applications, including grammar checking, semantic interpretation, dialogue understanding, and machine translation. They are powerful enough to express sophisticated relations among the words in a sentence, yet computationally tractable enough that efficient algorithms exist for parsing sentences with them (as described in Ch. 13 of the textbook). Ch. 14 of the textbook shows that adding probability to context-free grammars gives us a model of disambiguation, and also helps model certain aspects of human parsing.

In addition to an introduction to the grammar formalism, this chapter also provides a brief overview of the grammar of English. We have chosen a domain which has relatively simple sentences, the Air Traffic Information System (ATIS) domain (Hemphill et al., 1990). ATIS systems are an early example of spoken language systems for helping book airline reservations. Users try to book flights by conversing with the system, specifying constraints like I'd like to fly from Atlanta to Denver.
The U.S. government funded a number of different research sites to collect data and build ATIS systems in the early 1990s. The sentences we will be modeling in this chapter are drawn from the corpus of user queries to the system.

Constituency
How do words group together in English? Consider the noun phrase, a sequence of words surrounding at least one noun. Here are some examples of noun phrases (thanks to Damon Runyon):
a high-class spot such as Mindy's
the reason he comes into the Hot Box
three parties from Brooklyn
Harry the Horse
the Broadway coppers
they
How do we know that these words group together or "form constituents"?

One piece of evidence is that they can all appear in similar syntactic environments, for example before a verb:
three parties from Brooklyn arrive...
a high-class spot such as Mindy's attracts...
the Broadway coppers love...
they sit
But while the whole noun phrase can occur before a verb, this is not true of each of the individual words that make up a noun phrase. The following are not grammatical sentences of English (recall that we use an asterisk (*) to mark fragments that are not grammatical English sentences):
*from arrive...
*the is...
*as attracts...
*spot is...
Thus to correctly describe facts about the ordering of these words in English, we must be able to say things like "Noun phrases can occur before verbs".

Other kinds of evidence for constituency come from what are called preposed or postposed constructions. For example, the prepositional phrase on September seventeenth can be placed in a number of different locations in the following examples, including preposed at the beginning and postposed at the end:
On September seventeenth, I'd like to fly from Atlanta to Denver
I'd like to fly on September seventeenth from Atlanta to Denver
I'd like to fly from Atlanta to Denver on September seventeenth
But again, while the entire phrase can be placed differently, the individual words making up the phrase cannot be:
*On September, I'd like to fly seventeenth from Atlanta to Denver
*On I'd like to fly September seventeenth from Atlanta to Denver
*I'd like to fly on September from Atlanta to Denver seventeenth

Context-Free Grammars
The most commonly used mathematical system for modeling constituent structure in English and other natural languages is the Context-Free Grammar, or CFG. Context-free grammars are also called Phrase-Structure Grammars, and the formalism is equivalent to what is also called Backus-Naur Form, or BNF. The idea of basing a grammar on constituent structure dates back to the psychologist Wilhelm Wundt (1900), but was not formalized until Chomsky (1956) and, independently, Backus (1959).

A context-free grammar consists of a set of rules or productions, each of which expresses the ways that symbols of the language can be grouped and ordered together, and a lexicon of words and symbols. For example, the following productions express that an NP (or noun phrase) can be composed of either a ProperNoun or a determiner (Det) followed by a Nominal; a Nominal can be one or more Nouns.
NP → Det Nominal
NP → ProperNoun
Nominal → Noun | Nominal Noun

Context-free rules can be hierarchically embedded, so we can combine the previous rules with others like the following, which express facts about the lexicon:
Det → a
Det → the
Noun → flight
The symbols that are used in a CFG are divided into two classes:
1. The symbols that correspond to words in the language ("the", "nightclub") are called terminal symbols; the lexicon is the set of rules that introduce these terminal symbols.
2. The symbols that express clusters or generalizations of these are called nonterminals.

In each context-free rule, the item to the right of the arrow (→) is an ordered list of one or more terminals and nonterminals, while to the left of the arrow is a single nonterminal symbol expressing some cluster or generalization. Notice that in the lexicon, the nonterminal associated with each word is its lexical category, or part-of-speech, which is defined in Ch. 5 of the textbook.

A CFG can be thought of in two ways:
1. as a device for generating sentences, and
2. as a device for assigning a structure to a given sentence.
We saw this same dualism in our discussion of finite-state transducers in Ch. 3. As a generator, we can read the arrow as "rewrite the symbol on the left with the string of symbols on the right".

We say the string a flight can be derived from the nonterminal NP. Thus a CFG can be used to generate a set of strings. This sequence of rule expansions is called a derivation of the string of words. It is common to represent a derivation by a parse tree (commonly shown inverted, with the root at the top). Fig. 12.1 of the textbook shows the tree representation of this derivation; in bracketed form it is [NP [Det a] [Nominal [Noun flight]]].

In that parse tree we say that the node NP immediately dominates the node Det and the node Nom. We say that the node NP dominates all the nodes in the tree (Det, Nom, Noun, a, flight). The formal language defined by a CFG is the set of strings that are derivable from the designated start symbol. Each grammar must have one designated start symbol, which is often called S. Since context-free grammars are often used to define sentences, S is usually interpreted as the "sentence" node, and the set of strings that are derivable from S is the set of sentences in some simplified version of English.

Let's add to our list of rules a few higher-level rules that expand S, and a couple of others. One will express the fact that a sentence can consist of a noun phrase followed by a verb phrase:
S → NP VP (I prefer a morning flight)
A verb phrase in English consists of a verb followed by assorted other things; for example, one kind of verb phrase consists of a verb followed by a noun phrase:
VP → Verb NP (prefer a morning flight)
Or the verb phrase may have a verb followed by a noun phrase and a prepositional phrase:
VP → Verb NP PP (leave Boston in the morning)
Or the verb may be followed by a prepositional phrase alone:
VP → Verb PP (leaving on Thursday)
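Reading the arrow as a rewrite instruction, the rules above can be put to work directly as a generator. The following is a minimal sketch in plain Python (not from the slides; the rule table and function are purely illustrative) that performs a random leftmost derivation, first from NP and then from the start symbol S.

    import random

    # Toy grammar: each nonterminal maps to a list of possible right-hand sides.
    GRAMMAR = {
        "S":       [["NP", "VP"]],
        "NP":      [["Det", "Nominal"], ["ProperNoun"], ["Pronoun"]],
        "Nominal": [["Noun"], ["Nominal", "Noun"]],
        "VP":      [["Verb"], ["Verb", "NP"], ["Verb", "NP", "PP"], ["Verb", "PP"]],
        "PP":      [["Preposition", "NP"]],
        # Lexicon: preterminals rewrite to single words.
        "Det":         [["a"], ["the"]],
        "Noun":        [["flight"], ["morning"], ["trip"]],
        "Verb":        [["prefer"], ["leave"], ["want"]],
        "Pronoun":     [["I"], ["she"]],
        "ProperNoun":  [["Boston"], ["Denver"]],
        "Preposition": [["from"], ["to"], ["on"]],
    }

    def derive(symbols):
        """Rewrite the leftmost nonterminal until only terminals remain,
        printing every step of the derivation."""
        print(" ".join(symbols))
        while any(s in GRAMMAR for s in symbols):
            i = next(j for j, s in enumerate(symbols) if s in GRAMMAR)
            expansion = random.choice(GRAMMAR[symbols[i]])
            symbols = symbols[:i] + expansion + symbols[i + 1:]
            print(" ".join(symbols))
        return symbols

    random.seed(0)     # fix the random choices so a run is repeatable
    derive(["NP"])     # e.g. NP => Det Nominal => ... => a string such as "the flight"
    derive(["S"])      # a full sentence derivation from the start symbol

Running the process in reverse, that is, recovering which rules produced a given string, is exactly the parsing problem discussed later in the chapter.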
A prepositional phrase generally has a preposition followed by a noun phrase. For example, a very common type of prepositional phrase in the ATIS corpus is used to indicate location or direction:
PP → Preposition NP (from Los Angeles)
The NP inside a PP need not be a location; PPs are often used with times and dates, and with other nouns as well, and they can be arbitrarily complex. Here are ten examples from the ATIS corpus:
to Seattle
in Minneapolis
on Wednesday
in the evening
on the ninth of July
on these flights
about the ground transportation in Chicago
of the round trip flight on United Airlines
of the AP fifty seven flight
with a stopover in Nashville

The Lexicon for L0:
Noun → flights | breeze | trip | morning | ...
Verb → is | prefer | like | need | want | fly
Adjective → cheapest | non-stop | first | latest | other | direct | ...
Pronoun → me | I | you | it | ...
Proper-Noun → Alaska | Baltimore | Los Angeles | Chicago | United | American | ...
Determiner → the | a | an | this | these | that | ...
Preposition → from | to | on | near | ...
Conjunction → and | or | but | ...

The Grammar for L0, with example phrases for each rule:
S → NP VP                    I + want a morning flight
NP → Pronoun                 I
   | Proper-Noun             Los Angeles
   | Determiner Nominal      a + flight
Nominal → Nominal Noun       morning + flight
   | Noun                    flights
VP → Verb                    do
   | Verb NP                 want + a flight
   | Verb NP PP              leave + Boston + in the morning
   | Verb PP                 leaving + on Thursday
PP → Preposition NP          from + Los Angeles

Parse Tree for "I prefer a morning flight" (shown as a figure in the slides; the bracketed notation below encodes the same tree):
[S [NP [Pro I]] [VP [V prefer] [NP [Det a] [Nom [N morning] [Nom [N flight]]]]]]

Grammatical, Ungrammatical, and Generative Grammars
A CFG like that of L0 defines a formal language. We have shown in the previous chapters that a formal language is a set of strings. Sentences (strings of words) that can be derived by a grammar are in the formal language defined by that grammar and are called grammatical sentences. Sentences that cannot be derived by a given formal grammar are not in the language defined by that grammar and are referred to as ungrammatical.

This hard line between "in" and "out" characterizes all formal languages but is only a very simplified model of how natural languages really work. This is because determining whether a given sentence is part of a given natural language (say English) often depends on the context. In linguistics, the use of formal languages to model natural languages is called generative grammar, since the language is defined by the set of possible sentences "generated" by the grammar.
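To see the L0 grammar and the bracketed notation in action, here is a small sketch that assumes the NLTK toolkit is available (NLTK is not part of the slides); it encodes just the subset of L0 needed for the example sentence and asks a chart parser for its tree.

    import nltk

    # The subset of the L0 grammar needed for "I prefer a morning flight".
    l0 = nltk.CFG.fromstring("""
        S -> NP VP
        NP -> Pronoun | Det Nominal
        Nominal -> Noun | Nominal Noun
        VP -> Verb | Verb NP
        Pronoun -> 'I'
        Det -> 'a' | 'the'
        Noun -> 'morning' | 'flight'
        Verb -> 'prefer' | 'want'
    """)

    parser = nltk.ChartParser(l0)
    for tree in parser.parse("I prefer a morning flight".split()):
        print(tree)          # bracketed form, e.g. (S (NP (Pronoun I)) (VP ...))
        tree.pretty_print()  # ASCII drawing of the inverted parse tree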
Formal Definition of Context-Free Grammar
A context-free grammar G is defined by four parameters N, Σ, R (or P), S (technically, G "is a 4-tuple"):
N: a set of nonterminal symbols (or variables)
Σ: a set of terminal symbols (disjoint from N)
R (or P): a set of rules (or productions), each of the form A → β, where A is a nonterminal and β is a string of symbols from the infinite set of strings (Σ ∪ N)*
S: a designated start symbol

Notational Convention
For the remainder of the book we'll adhere to the following conventions when discussing the formal properties (as opposed to explaining particular facts about English or other languages) of context-free grammars.
Capital letters like A, B, and S: nonterminals
S: the start symbol
Lowercase Greek letters like α, β, and γ: strings drawn from (Σ ∪ N)*
Lowercase Roman letters like u, v, and w: strings of terminals

Derivation
A language is defined via the concept of derivation. One string derives another one if it can be rewritten as the second one via some series of rule applications. More formally, following Hopcroft and Ullman (1979), if A → β is a production of R, and α and γ are any strings in the set (Σ ∪ N)*, then we say that αAγ directly derives αβγ, or αAγ ⇒ αβγ.

Derivation is then a generalization of direct derivation. Let α1, α2, ..., αm be strings in (Σ ∪ N)*, m ≥ 1, such that α1 ⇒ α2, α2 ⇒ α3, ..., αm−1 ⇒ αm. We say that α1 derives αm, or α1 ⇒* αm. Formally, then, we define the language LG generated by a grammar G as the set of strings composed of terminal symbols which can be derived from the designated start symbol S:
LG = { w | w is in Σ* and S ⇒* w }
The problem of mapping from a string of words to its parse tree is called parsing; algorithms for parsing are covered in Ch. 13 and Ch. 14.

Some Grammar Rules for English
The focus is on the ATIS domain. Reference grammars of English:
Huddleston, R. and Pullum, G. K. (2002). The Cambridge Grammar of the English Language. Cambridge University Press.
Hudson, R. A. (1984). Word Grammar. Basil Blackwell, Oxford.

Sentence-Level Constructions
There are a large number of constructions for English sentences; four are particularly common and important:
1. Declarative
2. Imperative
3. Yes-no question
4. Wh-question structure

Declarative Structure of Sentences
Sentences with declarative structure have a subject noun phrase followed by a verb phrase, like "I prefer a morning flight":
S → NP VP
Sentences with this structure have a great number of different uses (discussed in detail in Ch. 23). The following examples are drawn from the ATIS domain:
The flight should be at eleven a.m. tomorrow
The return flight should leave at around seven p.m.
I'd like to fly the coach discount class
I want a flight from Ontario to Chicago
I plan to leave on July first around six thirty in the evening

Imperative Structure of Sentences
Sentences with imperative structure often begin with a verb phrase and have no subject. They are called imperative because they are almost always used for commands and suggestions; in the ATIS domain they are commands to the system:
S → VP
Show the lowest fare
Show me the cheapest fare that has lunch
Give me Sunday's flights arriving in Las Vegas from New York City
List all flights between five and seven p.m.
Show me all flights that depart before ten a.m. and have first class fares
Please list the flights from Charlotte to Long Beach arriving after lunch time
Show me the last flight to leave

Yes-No Question Structure of Sentences
Sentences with yes-no question structure are often (though not always) used to ask questions (hence the name), and begin with an auxiliary verb, followed by a subject NP, followed by a VP:
S → Aux NP VP
Here are some examples (note that the third example is not really a question but a command or suggestion; Ch. 23 of the textbook will discuss the uses of these question forms to perform different pragmatic functions such as asking, requesting, or suggesting):
Do any of these flights have stops?
Does American's flight eighteen twenty five serve dinner?
Can you give me the same information for United?

Wh-Word Structure of Sentences
The most complex of the sentence-level structures we will examine are the various wh structures. These are so named because one of their constituents is a wh-phrase, that is, one that includes a wh-word (who, whose, when, where, what, which, how, why). These may be broadly grouped into two classes of sentence-level structures. The wh-subject-question structure is identical to the declarative structure, except that the first noun phrase contains some wh-word:
S → Wh-NP VP
What airlines fly from Burbank to Denver?
Which flights depart Burbank after noon and arrive in Denver by six p.m.?
Whose flights serve breakfast?
Which of these flights have the longest layover in Nashville?

In the wh-non-subject question structure, the wh-phrase is not the subject of the sentence, and so the sentence includes another subject. In these types of sentences the auxiliary appears before the subject NP, just as in the yes-no question structures. Here is an example, followed by a sample rule:
What flights do you have from Burbank to Tacoma Washington?
S → Wh-NP Aux NP VP

Clauses and Sentences
Before we move on, we should clarify the status of the S rules in the grammars we just described. S rules are intended to account for entire sentences that stand alone as fundamental units of discourse. However, as we'll see, S can also occur on the right-hand side of grammar rules and hence can be embedded within larger sentences. Clearly, then, there's more to being an S than just standing alone as a unit of discourse.

What differentiates sentence constructions (i.e., the S rules) from the rest of the grammar is the notion that they are in some sense complete. In this way they correspond to the notion of a clause in traditional grammars, which are often described as forming a complete thought. One way of making this notion of "complete thought" more precise is to say that an S is a node of the parse tree below which the main verb of the S has all of its arguments. We'll define verbal arguments later, but for now let's just see an illustration from the parse tree for "I prefer a morning flight". The verb prefer has two arguments: the subject I (an NP) and the object a morning flight (part of the VP). One of the arguments appears below the VP node, but the other one, the subject NP, appears only below the S node.

The Noun Phrase
Our L0 grammar introduced three of the most frequent types of noun phrases that occur in English: pronouns, proper nouns, and the NP → Det Nominal construction.
While pronouns and proper nouns can be complex in their own ways, the central focus of this section is on the last type, since that is where the bulk of the syntactic complexity resides. We can view these noun phrases as consisting of a head, the central noun in the noun phrase, along with various modifiers that can occur before or after the head noun. Let's take a close look at the various parts.

The Determiner
Noun phrases can begin with simple lexical determiners, as in the following examples:
a stop
those flights
the flights
any flights
this flight
some flights
The role of the determiner in English noun phrases can also be filled by more complex expressions, as follows:
United's flight
United's pilot's union
Denver's mayor's mother's canceled flight
In these examples, the role of the determiner is filled by a possessive expression consisting of a noun phrase followed by an 's as a possessive marker, as in the following rule:
Det → NP 's
The fact that this rule is recursive (since an NP can start with a Det) will help us model the latter two examples above, where a sequence of possessive expressions serves as a determiner.

Optional determiner: there are also circumstances under which determiners are optional in English. For example, determiners may be omitted if the noun they modify is plural:
Show me flights from San Francisco to Denver on weekdays
As we saw earlier (in Ch. 5 of the textbook), mass nouns also don't require determination. Recall that mass nouns often (not always) involve something that is treated like a substance (including, e.g., water and snow), don't take the indefinite article "a", and don't tend to pluralize. Many abstract nouns are mass nouns (music, homework). Mass nouns in the ATIS domain include breakfast, lunch, and dinner:
Does this flight serve dinner?

The Nominal
The nominal construction follows the determiner and contains any pre- and post-head-noun modifiers. As indicated in grammar L0, in its simplest form a nominal can consist of a single noun:
Nominal → Noun
As we'll see, this rule also provides the basis for the bottom of various recursive rules used to capture more complex nominal constructions.

Before the Head Noun
A number of different kinds of word classes can appear before the head noun (the "postdeterminers") in a nominal. These include cardinal numbers, ordinal numbers, and quantifiers.
Cardinal numbers (one, two, three, ...): two friends, one stop
Ordinal numbers (first, second, third, and so on, but also words like next, last, past, other, and another): the first one, the next day, the second leg, the last flight, the other American flight
Quantifiers: some quantifiers (many, (a) few, several) occur only with plural count nouns: many fares. The quantifiers much and a little occur only with non-count nouns.
Adjectives occur after quantifiers but before nouns: a first-class fare, a non-stop flight, the longest layover, the earliest lunch flight
Adjectives can also be grouped into a phrase called an adjective phrase or AP. APs can have an adverb before the adjective (see Ch. 5 of the textbook for definitions of adjectives and adverbs), as in the least expensive fare. We can combine all the options for prenominal modifiers with one rule as follows:
NP → (Det) (Card) (Ord) (Quant) (AP) Nominal
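The parentheses in this rule mark optional constituents, so the single schema above stands in for 2^5 = 32 ordinary context-free rules. A small illustrative sketch (plain Python, not from the slides; the function and names are ours) that expands such a schema into its plain-rule equivalents:

    from itertools import product

    def expand_optionals(lhs, rhs):
        """Expand a rule whose right-hand side may contain optional symbols,
        given as (symbol, is_optional) pairs, into the equivalent set of
        plain CFG rules."""
        choices = [((sym,), ()) if optional else ((sym,),) for sym, optional in rhs]
        rules = set()
        for combo in product(*choices):
            flat = tuple(s for part in combo for s in part)
            rules.add((lhs, flat))
        return rules

    np_schema = [("Det", True), ("Card", True), ("Ord", True),
                 ("Quant", True), ("AP", True), ("Nominal", False)]
    rules = expand_optionals("NP", np_schema)
    for lhs, rhs in sorted(rules):
        print(lhs, "->", " ".join(rhs))
    print(len(rules), "rules in all")   # 2**5 = 32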
After the Head Noun
A head noun can be followed by postmodifiers. Three kinds of nominal postmodifiers are very common in English:
prepositional phrases (all flights from Cleveland)
non-finite clauses (any flights arriving after eleven a.m.)
relative clauses (a flight that serves breakfast)

Prepositional phrase postmodifiers are particularly common in the ATIS corpus, since they are used to mark the origin and destination of flights. Here are some examples, with brackets inserted to show the boundaries of each PP; note that more than one PP can be strung together:
any stopovers [for Delta seven fifty one]
all flights [from Cleveland] [to Newark]
arrival [in San Jose] [before seven p.m.]
a reservation [on flight six oh six] [from Tampa] [to Montreal]
A Nominal rule to account for postnominal PPs:
Nominal → Nominal PP

Non-finite Postmodifiers
The three most common kinds of non-finite postmodifiers are the gerundive (-ing), -ed, and infinitive forms. Gerundive postmodifiers are so called because they consist of a verb phrase that begins with the gerundive (-ing) form of the verb. In the following examples, the verb phrases happen to all have only prepositional phrases after the verb, but in general this verb phrase can have anything in it (anything, that is, which is semantically and syntactically compatible with the gerund verb):
any of those [leaving on Thursday]
any flights [arriving after eleven a.m.]
flights [arriving within thirty minutes of each other]
We can define the Nominals with gerundive modifiers as follows, making use of a new nonterminal GerundVP:
Nominal → Nominal GerundVP
We can make rules for GerundVP constituents by duplicating all of our VP productions, substituting GerundV for V:
GerundVP → GerundV NP | GerundV PP | GerundV | GerundV NP PP
GerundV can then be defined as:
GerundV → being | arriving | leaving | ...
The following are examples of the two other common kinds of non-finite clauses, infinitives and -ed forms:
the last flight to arrive in Boston
I need to have dinner served
Which is the aircraft used by this flight?

Relative Pronouns
A postnominal relative clause (more correctly, a restrictive relative clause) is a clause that often begins with a relative pronoun; that and who are the most common. The relative pronoun functions as the subject of the embedded verb (it is a subject relative) in the following examples:
a flight that serves breakfast
flights that leave in the morning
the United flight that arrives in San Jose around ten p.m.
the one that leaves at ten thirty five
We can deal with relative pronouns by adding rules like the following:
Nominal → Nominal RelClause
RelClause → (who | that) VP
The relative pronoun may also function as the object of the embedded verb, as in the following example; we leave writing grammar rules for more complex relative clauses of this kind as an exercise for the reader.
the earliest American Airlines flight that I can get

Postnominal Modifiers
Various postnominal modifiers can be combined, as the following examples show:
a flight [from Phoenix to Detroit] [leaving Monday evening]
I need a flight [to Seattle] [leaving from Baltimore] [making a stop in Minneapolis]
evening flights [from Nashville to Houston] [that serve dinner]
a friend [living in Denver] [that would like to visit me here in Washington DC]

Before the Noun Phrase
Word classes that modify and appear before NPs are called predeterminers. Many of these have to do with number or amount; a common predeterminer is all:
all the flights
all flights
all non-stop flights
The noun phrase "all the morning flights from Denver to Tampa leaving before 10" illustrates some of the complexity that arises when these rules are combined. (The slides show its parse tree as a figure, with the PreDet all, the Det the, a Nominal built from the nouns morning and flights, the PPs from Denver and to Tampa, and the GerundiveVP leaving before 10.)

Agreement
From the inflectional morphology of English, most verbs can appear in two forms in the present tense:
the form used for third-person singular subjects: The flight does
the form used for all other kinds of subjects: All the flights do, I do
The third-person singular (3sg) form typically has a final -s where the non-3sg form does not. Examples using the verb do:
Do [NP all of these flights] offer first class service?
Do [NP I] get dinner on this flight?
Do [NP you] have a flight from Boston to Fort Worth?
Does [NP this flight] stop in Dallas?
Examples with the verb leave:
What flights leave in the morning?
What flight leaves from Pittsburgh?
This agreement phenomenon occurs whenever there is a verb that has some noun acting as its subject. Ungrammatical examples, where the subject does not agree with the verb:
*[What flight] leave in the morning?
*Does [NP you] have a flight from Boston to Fort Worth?
*Do [NP this flight] stop in Dallas?

Grammar and Agreement
How can we modify our grammar to handle these agreement phenomena? One way is to expand our grammar with multiple sets of rules, one rule set for 3sg subjects and one for non-3sg subjects. For example, the rule that handled yes-no questions used to look like this:
S → Aux NP VP
We could replace this with two rules of the following form:
S → 3sgAux 3sgNP VP
S → Non3sgAux Non3sgNP VP
We could then add rules for the lexicon like these:
3sgAux → does | has | can | ...
Non3sgAux → do | have | can | ...
We would also need to add rules for 3sgNP and Non3sgNP, again by making two copies of each rule for NP. While pronouns can be first, second, or third person, full lexical noun phrases can only be third person, so for them we just need to distinguish between singular and plural (dealing with the first and second person pronouns is left as an exercise). A grammar fragment for handling singulars and plurals:
3SgNP → Det SgNominal
Non3SgNP → Det PlNominal
SgNominal → SgNoun
PlNominal → PlNoun
SgNoun → flight | fare | dollar | reservation | ...
PlNoun → flights | fares | dollars | reservations | ...
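To see the rule doubling at work, the sketch below (again assuming NLTK is available; the symbol names follow the slides, but the fragment itself is only illustrative) splits the yes-no question rule into 3sg and non-3sg versions. The grammar accepts Does this flight stop in Dallas but assigns no parse to the ungrammatical *Do this flight stop in Dallas.

    import nltk

    agreement = nltk.CFG.fromstring("""
        S -> ThreeSgAux ThreeSgNP VP | NonThreeSgAux NonThreeSgNP VP
        ThreeSgNP -> Det SgNominal
        NonThreeSgNP -> Det PlNominal
        SgNominal -> SgNoun
        PlNominal -> PlNoun
        VP -> Verb PP
        PP -> Preposition NP
        NP -> ProperNoun
        ThreeSgAux -> 'does'
        NonThreeSgAux -> 'do'
        Det -> 'this' | 'these'
        SgNoun -> 'flight'
        PlNoun -> 'flights'
        Verb -> 'stop'
        Preposition -> 'in'
        ProperNoun -> 'Dallas'
    """)

    parser = nltk.ChartParser(agreement)
    for sentence in ["does this flight stop in Dallas",
                     "do this flight stop in Dallas"]:   # the second is ungrammatical
        trees = list(parser.parse(sentence.split()))
        print(sentence, "->", len(trees), "parse(s)")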
The Problem: Increasing the Size of the Grammar
The problem with this method of dealing with number agreement is that it doubles the size of the grammar. Every rule that refers to a noun or a verb needs to have a "singular" version and a "plural" version. Unfortunately, subject-verb agreement is only the tip of the iceberg. We'll also have to introduce copies of rules to capture the fact that head nouns and their determiners have to agree in number as well:
this flight
those flights
*this flights
*those flight

Rule Proliferation
Rule proliferation will also have to happen for the noun's case; for example, English pronouns have nominative (I, she, he, they) and accusative (me, her, him, them) versions. We will need new versions of every NP and N rule for each of these. These problems are compounded in languages like German or French, which not only have number agreement as in English, but also have gender agreement: the gender of a noun must agree with the gender of its modifying adjective and determiner. This adds another multiplier to the rule sets of the language.

How to Handle Rule Proliferation
Ch. 16 of the textbook introduces a way to deal with these agreement problems without exploding the size of the grammar, by effectively parameterizing each nonterminal of the grammar with feature structures and unification. But for many practical computational grammars, we simply rely on CFGs and make do with the large numbers of rules.

The Verb Phrase and Subcategorization
The verb phrase consists of the verb and a number of other constituents. In the simple rules we have built so far, these other constituents include NPs and PPs and combinations of the two:
VP → Verb (disappear)
VP → Verb NP (prefer a morning flight)
VP → Verb PP (leaving on Thursday)
VP → Verb NP PP (leave Boston in the morning)

Sentential Complements
Verb phrases can be significantly more complicated than the examples above: many other kinds of constituents can follow the verb, such as an entire embedded sentence. These are called sentential complements:
You [VP [V said] [S there were two flights that were the cheapest]]
You [VP [V said] [S you had a two hundred sixty six dollar fare]]
[VP [V Tell] [NP me] [S how to get from the airport in Philadelphia to downtown]]
I [VP [V think] [S I would like to take the nine thirty flight]]
Here's a rule for these:
VP → Verb S
Another potential constituent of the VP is another VP. This is often the case for verbs like want, would like, try, intend, and need:
I want [VP to fly from Milwaukee to Orlando]
Hi, I want [VP to arrange three flights]
Hello, I'm trying [VP to find a flight that goes from Pittsburgh to Denver after two p.m.]

Subcategories
Traditional grammars subcategorize verbs into two categories, transitive and intransitive; modern grammars distinguish as many as 100 subcategories. In fact, tagsets for many such subcategorization frames exist; see Macleod et al. (1998) for the COMLEX tagset, Sanfilippo (1993) for the ACQUILEX tagset, and further discussion in Ch. 16 of the textbook.
Subcategorization Frames
Frame          Verb                     Example
∅              eat, sleep               I want to eat
NP             prefer, find, leave      Find [NP the flight from Pittsburgh to Boston]
NP NP          show, give               Show [NP me] [NP airlines with flights from Pittsburgh]
PPfrom PPto    fly, travel              I would like to fly [PP from Boston] [PP to Philadelphia]
NP PPwith      help, load               Can you help [NP me] [PP with a flight]
VPto           prefer, want, need       I would prefer [VPto to go by United airlines]
VPbrst         can, would, might        I can [VPbrst go from Boston]
S              mean                     Does this mean [S AA has a hub in Boston]?

Relation of Verbs and their Complements
How can we represent the relation between verbs and their complements in a context-free grammar? One thing we could do is what we did with agreement features: make separate subtypes of the class Verb (Verb-with-NP-complement, Verb-with-Inf-VP-complement, Verb-with-S-complement, and so on):
Verb-with-NP-complement → find | leave | repeat | ...
Verb-with-S-complement → think | believe | say | ...
Verb-with-Inf-VP-complement → want | try | need | ...
Each VP rule could be modified to require the appropriate verb subtype:
VP → Verb-with-no-complement (disappear)
VP → Verb-with-NP-comp NP (prefer a morning flight)
VP → Verb-with-S-comp S (said there were two flights)

The Problem of an Exploding Number of Rules
The standard solution to both of these problems is the feature structure, which will be introduced in Ch. 16, where we will also discuss the fact that nouns, adjectives, and prepositions can subcategorize for complements just as verbs can.

Auxiliaries
Auxiliaries, or helping verbs, are a subclass of verbs. They have particular syntactic constraints, which can be viewed as a kind of subcategorization. Auxiliaries include:
Modal verbs: can, could, may, might, must, will, would, shall, and should
Perfect auxiliary: have
Progressive auxiliary: be
Passive auxiliary: be
Modal verbs subcategorize for a VP whose head verb is a bare stem: can go in the morning, will try to find a flight.
The perfect auxiliary have subcategorizes for a VP whose head verb is the past participle form: have booked 3 flights.
The progressive auxiliary be subcategorizes for a VP whose head verb is the gerundive participle: am going from Atlanta.
The passive auxiliary be subcategorizes for a VP whose head verb is the past participle: was delayed by inclement weather.
A sentence can have multiple auxiliary verbs, but they must occur in a particular order: modal < perfect < progressive < passive. Some examples of multiple auxiliaries:
modal perfect: could have been a contender
modal passive: will be married
perfect progressive: have been fasting
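Both the complement facts in the frames table and the auxiliary facts above amount to listing, for each verb, which complements it can take. A minimal illustrative sketch (the dictionary and frame labels are ours, not a standard resource):

    # Toy subcategorization lexicon; frame labels follow the table above.
    SUBCAT = {
        "disappear": {"NONE"},
        "eat":       {"NONE"},
        "find":      {"NP"},
        "show":      {"NP NP"},
        "fly":       {"PPfrom PPto"},
        "help":      {"NP PPwith"},
        "want":      {"NP", "VPto"},
        "say":       {"S"},
        "can":       {"VPbrst"},    # auxiliaries subcategorize for VP forms too
    }

    def complement_allowed(verb, frame):
        """True if the verb subcategorizes for the given complement frame."""
        return frame in SUBCAT.get(verb, set())

    print(complement_allowed("find", "NP"))    # Find the flight ...        -> True
    print(complement_allowed("find", "VPto"))  # *I found to fly to Dallas  -> False
    print(complement_allowed("want", "VPto"))  # I want to fly to Detroit   -> True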
Coordination
The major phrase types discussed here can be conjoined with conjunctions like and, or, and but to form larger constructions of the same type. For example, a coordinate noun phrase can consist of two other noun phrases separated by a conjunction:
Please repeat [NP [NP the flights] and [NP the costs]]
I need to know [NP [NP the aircraft] and [NP the flight number]]
The fact that these phrases can be conjoined is evidence for the presence of the underlying Nominal constituent we have been making use of. Here's a new rule for this:
Nominal → Nominal and Nominal

Conjunctions Involving VPs and Ss
Examples:
What flights do you have [VP [VP leaving Denver] and [VP arriving in San Francisco]]
[S [S I'm interested in a flight from Dallas to Washington] and [S I'm also interested in going to Baltimore]]
The rules for VP and S conjunctions mirror the nominal one given previously:
VP → VP and VP
S → S and S
The conjunction rule can be generalized to all the major phrase types via a metarule:
X → X and X
This metarule simply states that any nonterminal can be conjoined with the same nonterminal to yield a constituent of the same type. Of course, the variable X must be designated as a variable that stands for any nonterminal rather than a nonterminal itself.

Treebanks
Context-free grammar rules of the type that have been explored so far in this chapter can be used, in principle, to assign a parse tree to any sentence. This means that it is possible to build a corpus in which every sentence is syntactically annotated with a parse tree. Such a syntactically annotated corpus is called a treebank. Treebanks play an important role in parsing (covered in Ch. 13 of the textbook) and in various empirical investigations of syntactic phenomena.

Treebanks and Parsers
A wide variety of treebanks have been created, generally by using parsers (of the sort described in Chapters 13 and 14 of the textbook) to automatically parse each sentence, and then using humans (linguists) to hand-correct the parses. The Penn Treebank project has produced treebanks from the Brown, Switchboard, ATIS, and Wall Street Journal corpora of English, as well as treebanks in Arabic and Chinese. Other treebanks include the Prague Dependency Treebank for Czech, the Negra treebank for German, and the Susanne treebank for English.

The Penn Treebank Project: Examples
ATIS corpus:
((S (NP-SBJ The/DT flight/NN)
    (VP should/MD
        (VP arrive/VB
            (PP-TMP at/IN (NP eleven/CD a.m/RB))
            (NP-TMP tomorrow/NN)))))
Brown corpus:
((S (NP-SBJ (DT That) (JJ cold) (, ,) (JJ empty) (NN sky))
    (VP (VBD was)
        (ADJP-PRD (JJ full)
            (PP (IN of)
                (NP (NN fire) (CC and) (NN light)))))
    (. .)))
(The slides also show the standard tree representation of the Brown corpus sentence "That cold, empty sky was full of fire and light." as a figure.)

A Sample of the CFG Grammar Extracted from the Treebank
S → NP VP . | NP VP | " S " , NP VP . | -NONE-
NP → DT NN | DT NN NNS | NN CC NN | CD RB | DT JJ , JJ NN | PRP | -NONE-
VP → MD VP | VBD ADJP | VBD S | VB PP | VB S | VB SBAR | VBP VP | VBN VP | TO VP
SBAR → IN S
ADJP → JJ PP
PP → IN NP
PRP → we | he
DT → the | that | those
JJ → cold | empty | full
NN → sky | fire | light | flight
NNS → assets
CC → and
IN → of | at | until | on
CD → eleven
RB → a.m
VB → arrive | have | wait
VBD → said
VBP → have
VBN → collected
MD → should | would
TO → to

Using Treebanks as a Grammar
The sentences in a treebank implicitly constitute a grammar of the language. For example, we can take the three parsed sentences of the Penn Treebank Project example and extract each of the CFG rules in them. For simplicity, let's strip off the rule suffixes (-SBJ and so on). The resulting grammar is the one shown above.
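Extracting the rules from a treebank parse is mechanical. The sketch below assumes NLTK is available, rewrites the ATIS example into the standard (TAG word) bracketing, reads off the productions, and strips the functional suffixes as the text suggests; the helper names are ours.

    import nltk
    from nltk.grammar import Nonterminal

    # The ATIS example tree, rewritten in standard (TAG word) bracketing.
    atis = nltk.Tree.fromstring("""
    (S (NP-SBJ (DT The) (NN flight))
       (VP (MD should)
           (VP (VB arrive)
               (PP-TMP (IN at) (NP (CD eleven) (RB a.m)))
               (NP-TMP (NN tomorrow)))))
    """)

    def strip_suffix(label):
        """Drop functional suffixes such as -SBJ or -TMP from a node label."""
        return label.split("-")[0]

    rules = set()
    for prod in atis.productions():
        lhs = strip_suffix(prod.lhs().symbol())
        rhs = [strip_suffix(s.symbol()) if isinstance(s, Nonterminal) else f"'{s}'"
               for s in prod.rhs()]
        rules.add(f"{lhs} -> {' '.join(rhs)}")

    for rule in sorted(rules):
        print(rule)   # e.g.  NP -> CD RB,  S -> NP VP,  VP -> MD VP, ...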
The Penn Treebank yields 4,500 different rules for expanding VP and PP. The Penn Treebank III Wall Street Journal corpus contains about 1 million words, about 1 million non-lexical rule tokens, and 17,500 distinct rule types.

Searching Treebanks
It is often important to search through a treebank to find examples of particular grammatical phenomena, either for linguistic research or for answering analytic questions about a computational application. But neither the regular expressions used for text search nor the boolean expressions over words used for web search are a sufficient search tool. What is needed is a language that can specify constraints about nodes and links in a parse tree, so as to search for specific patterns. Various such tree-searching languages exist in different tools. Tgrep (Pito, 1993) and TGrep2 (Rohde, 2005) are publicly available tools for searching treebanks that use a similar language for expressing tree constraints.

Heads and Head Finding
It was suggested earlier in this chapter that syntactic constituents could be associated with a lexical head; N is the head of an NP, V is the head of a VP. This idea of a head for each constituent dates back to Bloomfield (1914). It is central to linguistic formalisms such as Head-Driven Phrase Structure Grammar (Pollard and Sag, 1994), and has become extremely popular in computational linguistics with the rise of lexicalized grammars (see Ch. 14 of the textbook).

In one simple model of lexical heads, each context-free rule is associated with a head (Charniak, 1997; Collins, 1999). The head is the word in the phrase which is grammatically the most important. Heads are passed up the parse tree; thus each nonterminal in a parse tree is annotated with a single word, which is its lexical head. The slides show an example of such a lexicalized tree from Collins (1999), in which each nonterminal is annotated with its head; "Workers dumped sacks into a bin" is a shortened form of a WSJ sentence.

Head Finding
A practical approach to head finding is used in most computational systems: instead of specifying head rules in the grammar itself, heads are identified dynamically in the context of trees for specific sentences. In other words, once a sentence is parsed, the resulting tree is walked to decorate each node with the appropriate head. Most current systems rely on a simple set of hand-written rules, such as the practical one for Penn Treebank grammars given in Collins (1999) but developed originally by Magerman (1995). For example, their rule for finding the head of an NP is as follows (Collins, 1999, p. 238):
If the last word is tagged POS, return the last word.
Else search from right to left for the first child which is an NN, NNP, NNPS, NX, POS, or JJR.
Else search from left to right for the first child which is an NP.
Else search from right to left for the first child which is a $, ADJP, or PRN.
Else search from right to left for the first child which is a CD.
Else search from right to left for the first child which is a JJ, JJS, RB, or QP.
Else return the last word.
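These NP head rules translate almost line for line into code. The sketch below is illustrative only: it works on a flat list of (word, label) children rather than a full tree API, and simply mirrors the rules quoted above.

    def find_np_head(children):
        """Return the head child of an NP, following the Collins/Magerman rules
        quoted above. `children` is the NP's list of (word, label) pairs,
        left to right, where a label is a POS tag or a phrase category."""
        def rightmost(labels):
            for child in reversed(children):
                if child[1] in labels:
                    return child
            return None

        if children[-1][1] == "POS":                    # last word tagged POS
            return children[-1]
        head = rightmost({"NN", "NNP", "NNPS", "NX", "POS", "JJR"})
        if head:
            return head
        for child in children:                          # left to right for NP
            if child[1] == "NP":
                return child
        for labels in ({"$", "ADJP", "PRN"}, {"CD"}, {"JJ", "JJS", "RB", "QP"}):
            head = rightmost(labels)
            if head:
                return head
        return children[-1]                             # fall back to the last word

    # "a morning flight": the rightmost NN wins.
    print(find_np_head([("a", "DT"), ("morning", "NN"), ("flight", "NN")]))
    # -> ('flight', 'NN')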
Grammar Equivalence and Normal Form
We could ask whether two grammars are equivalent by asking whether they generate the same set of strings. In fact it is possible to have two distinct context-free grammars generate the same language. We usually distinguish two kinds of grammar equivalence: weak equivalence and strong equivalence. Two grammars are strongly equivalent if they generate the same set of strings and if they assign the same phrase structure to each sentence (allowing merely for renaming of the nonterminal symbols). Two grammars are weakly equivalent if they generate the same set of strings but do not assign the same phrase structure to each sentence.

Normal Form
It is sometimes useful to have a normal form for grammars, in which each of the productions takes a particular form. For example, a context-free grammar is in Chomsky Normal Form (CNF) (Chomsky, 1963) if it is ε-free and if in addition each production is either of the form A → B C or A → a. That is, the right-hand side of each rule either has two nonterminal symbols or one terminal symbol. Chomsky normal form grammars are binary branching, i.e., they have binary trees (down to the prelexical nodes).

Any grammar can be converted into a weakly equivalent Chomsky normal form grammar. For example, a rule of the form A → B C D can be converted into the following two CNF rules (Exercise 12.11 asks the reader to formulate the complete algorithm):
A → B X
X → C D

Example: Penn Treebank Grammar
The Penn Treebank contains families of rules of the form VP → VBD PP, VP → VBD PP PP, VP → VBD PP PP PP, VP → VBD PP PP PP PP, and so on (in effect, VP → VBD PP*). Such a family could also be generated by the following two-rule grammar:
VP → VBD PP
VP → VP PP
The generation of a symbol A with a potentially infinite sequence of symbols B by using a rule of the form A → A B is known as Chomsky adjunction.
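The same binarization trick extends to right-hand sides of any length. The sketch below is illustrative (it ignores the ε-removal and unit-rule steps that a full CNF conversion also needs) and introduces fresh nonterminals X1, X2, ... for long rules:

    def binarize(rules):
        """Replace every rule A -> B C D ... whose right-hand side is longer
        than two symbols with a chain of binary rules, introducing fresh
        nonterminals X1, X2, ...  Rules are (lhs, rhs_tuple) pairs."""
        new_rules, counter = [], 0
        for lhs, rhs in rules:
            while len(rhs) > 2:
                counter += 1
                fresh = f"X{counter}"
                new_rules.append((lhs, (rhs[0], fresh)))
                lhs, rhs = fresh, rhs[1:]
            new_rules.append((lhs, rhs))
        return new_rules

    for lhs, rhs in binarize([("A", ("B", "C", "D")),
                              ("VP", ("VBD", "NP", "PP", "PP"))]):
        print(lhs, "->", " ".join(rhs))
    # A -> B X1,  X1 -> C D,  VP -> VBD X2,  X2 -> NP X3,  X3 -> PP PP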
Finite-State and Context-Free Grammars
Adequate models of grammar need to be able to represent complex, interrelated facts about constituency, subcategorization, and dependency relations, and at the least the power of context-free grammars is needed to accomplish this. Why are finite-state methods not adequate to capture these syntactic facts?
1. Given certain assumptions, certain syntactic structures present in English (and other natural languages) make them not regular languages.
2. Even when finite-state methods are capable of dealing with the syntactic facts in question, they often don't express them in ways that make generalizations obvious.

There is a completely equivalent alternative to finite-state machines and regular expressions for describing regular languages, called regular grammars. The rules in a regular grammar are a restricted form of the rules in a context-free grammar because they are in right-linear or left-linear form. In a right-linear grammar, for example, the rules are all of the form
A → w  or  A → w B
that is, the nonterminals either expand to a string of terminals or to a string of terminals followed by a nonterminal. These rules look an awful lot like the rules we've been using throughout this chapter, so what can't they do? What they can't do is express recursive center-embedding rules like the following, where a nonterminal is rewritten as itself, surrounded by (nonempty) strings:
A →* α A β
For many practical purposes where matching syntactic and semantic rules isn't necessary, finite-state rules are quite sufficient. Thus, it is possible to automatically build a regular grammar which is an approximation of a given context-free grammar; see the references at the end of the chapter.

Dependency Grammar
See the textbook for further information on other grammar formalisms: dependency grammar, categorial grammar, etc.

Spoken Language Syntax
The grammar of written English and the grammar of conversational spoken English share many features, but they differ in a number of respects. The term utterance is used to describe a unit of spoken language, while sentence describes a unit of written English. In transcriptions:
a comma "," marks a short pause,
a period "." marks a long pause,
fragments (incomplete words like wha- for what) are marked with a dash, and
square brackets (e.g. "[smack]") mark non-verbal events (lip smacks, breaths, coughs, etc.).
Differences between spoken-language syntax and written-language syntax:
Spoken language has a higher number of pronouns; the subject of an utterance is almost invariably a pronoun.
Utterances often consist of short fragments or phrases: One way, or Around four p.m.
Utterances have phonological, prosodic, and acoustic characteristics that written sentences do not have.
Utterances (spoken sentences) have various kinds of disfluencies, hesitations, repairs, restarts, etc.

Disfluencies and Repair
The most salient of these phenomena are known individually as disfluencies and collectively as repair. Disfluencies include the use of the words uh and um, word repetitions, restarts, and word fragments. An example ATIS utterance:
Does American Airlines offer any one-way flights [uh] one-way fares for 160 dollars?
Here "one-way flights" is the reparandum, "[uh]" is the editing phase, and "one-way fares" is the repair.
The reparandum is the phrase that needs correcting. The interruption point is the breaking-off point of the original sequence of words. The editing phase consists of what are often called edit terms; phrases such as you know, I mean, uh, and um are often used. Filled pauses or fillers (uh, um, ...) are generally treated like regular words in speech recognition lexicons and grammars. The repair is the corrected phrase.

Treebanks for Spoken Language
Detection and correction of disfluencies is one of the most important research topics in speech understanding. Disfluencies are very common: 37% of the sentences in the Switchboard corpus with more than two words are disfluent in some way, and the "word" uh is one of the most frequent words in Switchboard. In order to facilitate this research, the Switchboard corpus treebank was augmented to include spoken-language phenomena like disfluencies. Example:
But I don't have [ any, + {F uh, } any ] real idea
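As a rough illustration of how such annotations can be processed, the sketch below keeps only the repair from a flat "[ reparandum, + repair ]" region and drops "{F ...}" filler annotations; it handles only simple, un-nested regions like the example above and is not an official Treebank tool.

    import re

    def remove_disfluencies(annotated):
        """Keep only the repair from flat '[ reparandum, + repair ]' regions
        and drop '{F ...}' filler annotations."""
        no_fillers = re.sub(r"\{F[^}]*\}\s*", "", annotated)
        repaired = re.sub(r"\[\s*[^+\[\]]*\+\s*([^\[\]]*)\]", r"\1", no_fillers)
        return re.sub(r"\s+", " ", repaired).strip()

    print(remove_disfluencies("But I don't have [ any, + {F uh, } any ] real idea"))
    # -> "But I don't have any real idea"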
Grammars and Human Processing
Do people use context-free grammars in their mental processing of language? It has proved very difficult to find clear-cut evidence that they do.

Summary
This chapter introduced fundamental concepts in syntax via the context-free grammar.
A constituent is a group of consecutive words that acts as a single unit; constituency can be modeled by context-free grammars.
A context-free grammar consists of a set of rules or productions defined over a set of non-terminal symbols and a set of terminal symbols.
A context-free language is the set of strings which are derivable from a particular context-free grammar.
A generative grammar is a traditional name in linguistics for a formal language which is used to model the grammar of a natural language.
There are many sentence-level grammatical constructions in English; declarative, imperative, yes-no-question, and wh-question are four very common types, which can be modeled with context-free rules.
An English noun phrase can have determiners, numbers, quantifiers, and adjective phrases preceding the head noun, which can be followed by a number of postmodifiers; gerundive VPs, infinitive VPs, and past participial VPs are common possibilities.
Subjects in English agree with the main verb in person and number.
Verbs can be subcategorized by the types of complements they expect. Simple subcategories are transitive and intransitive; most grammars include many more categories than these.
The correlates of sentences in spoken language are generally called utterances. Utterances may be disfluent, containing filled pauses like um and uh, restarts, and repairs.
Treebanks of parsed sentences exist for many genres of English and for many languages. Treebanks can be searched using tree-search tools.
Context-free grammars are more powerful than finite-state automata, but it is nonetheless possible to approximate a context-free grammar with an FSA.
There is some evidence that constituency plays a role in the human processing of language.