
1.2 LFG: A Historical and Contrastive Perspective

Lexical Functional Grammar (LFG), a generative, lexicalist unification grammar theory, was first systematically introduced by Kaplan and Bresnan (1982).

Like almost all other contemporary generative grammatical theories, LFG has its roots deeply embedded in Chomsky's early generative syntax, and yet it was developed as an improvement on, and a reaction to, the inadequacies that the LFG theorists observed in the directions the mainstream Chomskyan grammarians chose to follow. To understand the motivation and development of LFG, it is revealing to first look at LFG within the context of the developmental stages of Chomsky's generative syntactic theory and in contrast with other contemporary grammatical theories.

Among current syntactic theories, we choose Lexicase (LXC) and Government and Binding Theory (GB) to compare with LFG; both provide perspectives different from LFG's and thus serve well as points of contrast.

1.2.1 The Revolution: Transformational Generative Grammar

Nearly all contemporary generative syntactic theories share a common ancestry: Chomsky's revolutionary work on generative transformational syntax in the late 1950's. We believe each of the three theories, in the areas it chooses to emphasize, represents a different reaction to, or extension of, Chomsky's earlier interpretation of syntax. From the late 1950's to the present, Chomsky's syntactic theory has undergone roughly three perceivable developmental phases. Syntactic Structures (Chomsky 1957) revolutionized syntactic theorizing, trumpeted the advent of the era of generative grammar, and firmly established the study of linguistics as a scientific pursuit. The proposal was that the objective of a grammar is to "generate" all and only the infinitely many grammatically acceptable strings of a natural language.

The key point is thus that grammars should be 'generative' in a mathematical sense, and the criterion of explicit, formal, and falsifiable formulation of linguistic statements and generalizations was therefore greatly emphasized. The claim was that this goal can be achieved with a transformational grammar, but not with a phrase structure grammar or the earlier Structuralist approach to language.

1.2.2 The Second Stage: The Standard Theory

The 'Standard Theory' of Aspects of the Theory of Syntax (Chomsky 1965) posited two levels of syntactic representation: a deep structure as the basis for meaning interpretation and a surface structure as the basis for phonological interpretation. The crucial linking between the deep and surface structures is accomplished by transformations. The focus of syntactic theory became how a grammar should model the mental mapping between meaning and sound, and thus linguistic analyses should be psychologically real. The requirement of explicitness and formality began to lose its earlier visibility.

The Standard Theory soon split into several revised and extended versions by the late 1970's. The most salient, and probably also most important, trend of syntactic research was the restraining of powerful theoretical constructs such as transformational rules. Within the Chomskyan camp of transformational grammar the greatest effort has gone into the elimination of various structure-specific transformations; however, layers of highly abstract constraints had to be devised to allow for one single general transformation: Move-α. Other theories, including LXC and LFG, have on the other hand ruled out the theoretical validity of transformations altogether and treat syntax as a purely surface phenomenon. As a matter of fact, LXC, dating back to the early 1970's, was probably the first generative theory of syntax that was entirely transformationless.

1.2.3 The Third Stage: Government and Binding Theory

Lectures on Government and Binding (Chomsky 1981), which introduced Government and Binding Theory, embodied research within the Standard Theory and its extended models concerned with constraining transformations and with attaining explanatory adequacy: only grammars learnable from the primary data should be allowed by the theory of language. In other words, GB attempts to provide a theory in which the grammar of a natural language can be inferred from a set of universal principles and the setting of certain universal parameters. In the pursuit of linguistic parameters and universal grammar, the standards of explicit, formal, and detailed formulation of analyses of specific syntactic constructions waned and were even reproved. While there is a considerable amount of GB cross-language research on parameters of language variation and universal principles, one rarely finds an explicit formulation of an analysis of a specific syntactic construction of a particular language.

1.2.4 Lexicase and Formal Rigor

While GB has considerably compromised the standards of formal explicitness and psychological realism, LXC and LFG to a significant degree represent serious efforts to reemphasize some of the worthwhile characteristics of the earlier phases of Chomsky's generative theory. Both LFG and LXC claim their respective theories to be universal models for all languages and therefore stress the standards of explicit rigor, psychological reality, and universality. LFG, however, places strong emphasis on the computational and psycholinguistic processing of language, while the LXC literature has demonstrated a most serious commitment to the formal and explicit formulation of linguistic generalizations. One of LXC's strongest objections to GB is that it is not clear whether GB can still be considered 'generative' in the original sense intended by Chomsky in the first stage of the late 1950's. Starosta, the primary theorist of LXC, who had substantial training in physics, devoted an entire volume to readdressing the issue of 'generative grammar' as a hypothetico-deductive science (Starosta 1987), and on numerous occasions reiterated the generative aspect of LXC and the goal of an LXC grammar: to generate all and only the acceptable phrases of a language, of which sentences are a subset. The LXC literature is therefore largely composed of detailed, explicit accounts of grammatical phenomena of various natural languages, most of which are non-Indo-European.

1.2.5 LFG and the Emphasis on Processing

LFG's concern for the processing aspect of language can no doubt be partially attributed to the two primary architects of the theory, Kaplan and Bresnan. The formal conception of LFG evolved in the mid-1970's from earlier work in both Transformational Grammar and computational linguistics.

Kaplan was a psychologist who did experimental work on human sentence processing and computational natural language processing. He was one of the designers of the Augmented Transition Network (ATN) grammar, a computationally oriented grammar which also served as one of the precursors of LFG. When making the transition from Transformational Grammar to LFG, Bresnan (1978) argued that the model she was proposing was psychologically more realistic. This point was again crucially emphasized in the most important compilation of LFG work, The Mental Representation of Grammatical Relations (Bresnan 1982). LFG thus differs from the other theories in being a linguistic theory that also aims to serve as the grammatical basis of a computationally precise and psychologically realistic model of natural languages. Consequently, many of its theoretical decisions have been influenced by this perspective (Bresnan 1982, Sells 1985). The fact that a great number of natural language processing research projects employ LFG-style formalisms also reveals LFG's commitment to being a processing-oriented syntactic theory.

Another striking similarity that LFG shares with the Standard Theory concerns levels of representation: just as the Standard Theory identifies deep structure as the basis for semantic interpretation and surface structure as the basis for phonological interpretation, with transformational rules as the linkage, LFG also posits two levels of syntactic representation: the c-structure (constituent structure), which is the basis for phonological operations, and the f-structure (functional structure), from which the semantic representation is derived, with functional descriptions providing the linkage between c- and f-structures. However, it should be quickly pointed out that the similarity between deep/surface structure and c-/f-structure stops here. While deep and surface structures are two separate strata in a derivation process, c-structure and f-structure are associated with each other at any given point of the derivation and are thus co-descriptions of the same word string.
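As a minimal illustration of this co-description (the notation is standard LFG; the toy sentence is ours), consider an annotated c-structure rule in which each daughter carries a functional equation:

```
S  →       NP          VP
       (↑ SUBJ)=↓     ↑=↓
```

For a sentence such as "John yawned", the equations identify the VP's f-structure with the clause's and embed the NP's f-structure as its SUBJ, yielding roughly:

```
[ PRED   'yawn<SUBJ>'
  TENSE   PAST
  SUBJ   [ PRED 'John' ] ]
```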

1.2.6 Points of Convergence

Despite the fact that these three theories differ in what they choose to emphasize, in the assumptions they make as the basis of a syntactic theory, and in the drastically different formalisms they employ, there are two significant points of convergence among them, and perhaps among other contemporary theories as well: the reduced role of transformations and the increased role of the lexicon. How to limit or eliminate the power of transformations, and the shifting of emphasis to the lexicon, have thus been the two major trends in the study of syntax in the past two decades. GB has reduced the earlier assortment of ad hoc and powerful transformational rules to just one: Move-α (move anything to anywhere). LFG and LXC eliminated the theoretical validity of transformations entirely and employ different analyses and/or lexical/morphological processes to account for syntactic phenomena that were previously accounted for by transformations.

The lexicon plays a central role in Lexical Functional Grammar and Lexicase, as their names suggest. Lexicase takes the most extreme position: the lexicon of a language is the entire grammar of that language, and all linguistic generalizations are expressed by lexical rules, which must be feature-preserving, i.e., they can only add but cannot delete or change features.
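As a minimal sketch of this constraint (the representation and function names are ours and purely illustrative, not taken from the LXC literature), a feature-preserving rule may extend a lexical entry but never overwrite it:

```python
def apply_lexical_rule(entry: dict, additions: dict) -> dict:
    """Return a copy of `entry` extended with `additions`.

    Enforces a Lexicase-style feature-preserving constraint:
    a rule may add features but may not delete or change them.
    """
    for feature, value in additions.items():
        if feature in entry and entry[feature] != value:
            raise ValueError(f"rule would alter existing feature {feature!r}")
    return {**entry, **additions}

# Hypothetical entry and rule applications:
cat = {"lex": "cat", "class": "N", "number": "SG"}
derived = apply_lexical_rule(cat, {"case": "NOM"})   # OK: only adds a feature
# apply_lexical_rule(cat, {"number": "PL"})          # raises: changes a feature
```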

In LFG, every lexical entry has a set of functional expressions associated with it, and the f-structure of a phrase or a sentence is the result of the unification of lexical functional structures according to the functional specifications associated with the phrase structure rules that build the phrase or sentence. What transformations used to perform is now largely handled by lexical rules, such as Passivization, Equi, and Raising. The significant similarity among all three theories is therefore that the clause structure of a verb can to a large extent be predicted from its semantic predicate structure, and the argument structure of a predicate is specified in the lexicon. The Projection Principle of GB and the Principle of Function-Argument Biuniqueness of LFG ensure that the predicate argument structure is realized structurally and may not be altered in essential ways. Such conditions are not necessary in LXC, where contextual features associated with lexical items entirely dictate the possible phrase structures of a clause, and implied case relations, i.e., roughly thematic relations, are also specified in the lexical entry.
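The unification step just described can be made concrete with a toy sketch (nested dicts as f-structures; this is our illustration, not the Kaplan and Bresnan algorithm itself): two f-structures unify if their atomic values never conflict, and the result pools the information contributed by each lexical item.

```python
def unify(f1, f2):
    """Return the unification of two f-structures, or None on a feature clash."""
    if f1 == f2:                               # identical atoms or structures
        return f1
    if not (isinstance(f1, dict) and isinstance(f2, dict)):
        return None                            # conflicting atomic values
    result = dict(f1)
    for attr, val in f2.items():
        if attr in result:
            merged = unify(result[attr], val)  # unify shared attributes recursively
            if merged is None:
                return None                    # clash propagates: unification fails
            result[attr] = merged
        else:
            result[attr] = val                 # information found in only one source
    return result

# Hypothetical lexical contributions for "John yawned":
john   = {"SUBJ": {"PRED": "John", "NUM": "SG"}}
yawned = {"PRED": "yawn<SUBJ>", "TENSE": "PAST", "SUBJ": {"NUM": "SG"}}
print(unify(john, yawned))
# {'SUBJ': {'PRED': 'John', 'NUM': 'SG'}, 'PRED': 'yawn<SUBJ>', 'TENSE': 'PAST'}
```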

To view LFG through this historical and comparative frame is interesting and revealing: it seems that each of the theories discussed here, in its particular emphasis, presents a different reaction to Chomsky's later directions and represents a developmental stage of Chomsky's theory of syntax.

However, we by no means mean to accuse LXC or LFG of reversing progress, for both theories do address the important issue of explanatory adequacy. Rather, we respect the commitment to formal rigor and processing efficiency in grammar writing on the part of LFG and LXC. We will return to this point in Chapter 4, where we discuss the application of these three theories to computational tasks of natural language processing.
