
Conclusion

Constraint-based grammars can, in principle, be used both for parsing and for generation. This is also true for grammars defined in ${\cal R}({\cal L})$. However, to use such grammars in a practically interesting way, they need to be restricted in some way, as the parsing and generation problems for ${\cal R}({\cal L})$ grammars are undecidable. Historically, the restriction (imposed so that parsing can be efficient) has been that phrases are built by concatenation. No corresponding restriction was assumed for generation, as generation played only a minor role. Generation on the basis of declarative grammars, however, is not without problems. For example, the simple top-down, left-to-right backtrack search strategy leads to non-termination for linguistically motivated grammars. Chapter 3 discusses several (linguistically relevant) problems for some obvious generation techniques. Furthermore, that chapter provides motivation for a different processing strategy, in which generation proceeds essentially bottom-up and head-driven. A simple version of this strategy is defined, and its properties and shortcomings are investigated. Several variants and possible improvements are discussed.

Relying on the notion `head' has implications for the way in which semantic structures should be combined in the grammar. Essentially, the semantic-head-driven generation algorithm embodies the assumption that semantic structures are defined in a lexical and head-driven fashion. It turns out that the head-driven generation strategy faces problems with analyses in which this assumption is violated; an important example is an analysis in which the head of a construction has been `displaced'. Such an analysis is often assumed for verb-second phenomena in, for example, German and Dutch, if grammars are restricted to be concatenative. Thus, the assumption that semantic structures are built lexically and head-driven does not make linguistic sense if phrases are to be built by concatenation. For these assumptions to make linguistic sense, it is necessary to allow more freedom in the way phonological structures are combined: instead of concatenative grammars, non-concatenative grammars are called for.
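
To make this strategy concrete, the following sketch (in Python) generates from a toy lexicon in which each head's lexical entry records its predicate and the categories of its complements, so that the goal semantics directly identifies the lexical head. This is only an informal illustration of the idea that generation starts from the semantic head and works bottom-up from the lexicon; the algorithm defined in chapter 3 is considerably more general, and all names and representations below are illustrative simplifications rather than the actual definitions.

    # Toy lexicon: word, category, predicate, and the categories of the
    # complements the word selects (one per argument position).  Category
    # checks are omitted in this toy.
    LEXICON = [
        ("sleeps", "v",  "sleep", ["np"]),
        ("sees",   "v",  "see",   ["np", "np"]),
        ("mary",   "np", "mary",  []),
        ("john",   "np", "john",  []),
    ]

    def generate(sem):
        """Generate a phrase for sem, which is either an atom ('mary') or a
        tuple (predicate, arg_sem, ...).  The lexical head is located first,
        directly from the semantics (bottom-up, head-driven); its complements
        are then generated recursively from the argument semantics."""
        pred, args = (sem[0], list(sem[1:])) if isinstance(sem, tuple) else (sem, [])
        for word, cat, entry_pred, comps in LEXICON:
            if entry_pred == pred and len(comps) == len(args):
                comp_phons = [generate(a) for a in args]
                # naive surface order: the first complement (the subject)
                # precedes the head, the remaining complements follow it
                return " ".join(comp_phons[:1] + [word] + comp_phons[1:])
        raise ValueError("no lexical head for %r" % (sem,))

    print(generate(("see", "mary", "john")))   # mary sees john
    print(generate(("sleep", "john")))         # john sleeps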

In chapter 4, linguistic motivation for such non-concatenative operations on phonological representations is discussed. A number of proposals for operations such as `head-wrapping' and `sequence-union' are described, and a class of formalisms is introduced in which operations on strings are allowed as long as they are linear and non-erasing. Formalisms such as Head Grammars [67] and Tree Adjoining Grammars [40] are members of this class. Clearly, parsing algorithms developed for concatenative formalisms cannot be used for these more powerful formalisms; an important question thus is how to parse with non-concatenative grammars. I describe a very general parsing algorithm for this class of grammars which, again, proceeds in a head-driven fashion. The motivation for such a processing strategy is discussed. Furthermore, I describe possible extensions and improvements of the basic algorithm, and I show how the parser can be `specialized' for lexicalized, constraint-based versions of Tree Adjoining Grammars.
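
To give an impression of what such operations look like, the following sketch represents a phrase as a pair of word lists, split at a position associated with the head, loosely in the spirit of the head-wrapping operations of Head Grammars [67]. Both operations below are linear and non-erasing, since every word of every argument occurs exactly once in the result; the operation names, the split convention, and the Dutch mini-example are illustrative choices, not the definitions of chapter 4.

    # A phrase is a pair (left, right) of word lists, split at a designated,
    # head-related position.

    def concat(a, b):
        """Ordinary concatenation; the result inherits b's split point."""
        return (a[0] + a[1] + b[0], b[1])

    def wrap(a, b):
        """Head-wrapping: b is inserted at a's split point."""
        return (a[0] + b[0], b[1] + a[1])

    vp   = (["kust"], ["marie"])   # VP split directly after the verbal head
    subj = (["jan"], [])

    v2 = concat(subj, vp)   # verb-second order of a Dutch main clause
    v1 = wrap(vp, subj)     # verb-initial order of a Dutch yes/no question

    print(" ".join(v2[0] + v2[1]))   # jan kust marie
    print(" ".join(v1[0] + v1[1]))   # kust jan marie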

In order for grammars to be used effectively for both parsing and generation, such grammars need to adhere to two assumptions: on the one hand, that semantic structures are built in a lexical and head-driven fashion, and on the other hand, that phonological structures are built in a linear and non-erasing way.

