Aspects of Grammatical Architecture
About This Book

This volume collects eleven papers written between 1991 and 2016, some of them unpublished, which explore various aspects of the architecture of grammar in a minimalist perspective. The phenomena that are brought to bear on the architectural issue come from a range of languages, among them French, European Portuguese, Welsh, German and English, and include clitic placement, expletive pronouns, resumption, causative structures, copulative and existential constructions, VP ellipsis, as well as the distinction between the SVO, VSO and V2 linguistic types. This book sheds new light on the division of labor between components and paves the way for further research on grammatical architecture.

Aspects of Grammatical Architecture by Alain Rouveret is available in PDF and ePUB format.

Information

Author: Alain Rouveret
Publisher: Routledge
Year: 2018
ISBN: 9781351622196
Edition: 1
Pages: 418

1 Introduction

1 The Architecture of Grammar and the Architecture of Syntactic Objects

This book is entitled Aspects of Grammatical Architecture for several reasons. First, although the various chapters deal with different issues, some of them central within the theory of generative grammar, others more peripheral, all of them share the assumption that only a derivational model endowed with a specific architecture can account for the complexity of language. Indeed, one of the distinguishing features of Chomskyan generative models, at least since Aspects, is the assumption that the grammar of natural languages, which by definition relates sound and meaning, is architecturally organized. The consensus today is that, when properly formalized, the tacit knowledge of linguistic structure that the speaker displays when he produces or understands fragments of his language takes the form of a computational model very similar to the Principles and Parameters framework. In this model, the grammar is conceived of as consisting of several levels of representation—D-structure, S-structure, Logical Form (LF), Phonetic Form (PF)—related by various mapping operations. These levels are distributed into components, depending on the kind of rule involved in their derivation. The two subderivations relating syntactic representations to LF-representations and to PF-representations take place within components that are distinct from each other and from syntax proper. This gives rise to a Y grammatical architecture. The properties of each level of representation are further determined by the combined effects of several modules, each dealing with a specific grammatical dimension (Case, θ-assignment, binding, movement).
As is well-known, this picture is drastically modified in the Minimalist Program, sketched by Chomsky (1993, 1995b), which is based on the claim, known as the “strong minimalist thesis” (SMT), that language is an optimal solution to the interface conditions that LF and PF must satisfy. The grammar contains no “internal” level anymore, only the interface ones, i.e. LF and PF, subsist; the idea that something like D-structure exists, i.e. a prebuilt categorial architecture observing X’-theory and serving as input to the rules of lexical insertion, is abandoned; there is a derivation, narrow syntax, that relates a lexical selection, the numeration, to the semantic interface; S-structure disappears as a level of representation and is henceforth conceived as Spell-Out, the point along this mapping where syntactic information is sent to the sound interface. All in all, the Y architecture survives, albeit in a different form.
Under the SMT, the question of the relation between syntactic derivations and the meaning and sound interfaces comes up in different terms than before. First, the SMT leads to the expectation that many properties of narrow syntax indirectly derive from the necessity to create linguistic objects that meet the requirements of the two cognitive systems, the conceptual-intentional system C-I and the sensorimotor system SM, with which the language faculty interfaces. Since the meaning and sound representations coincide with the two interface levels and since these levels are the only representational levels in the grammar, all conditions on representations hold either at LF or at PF. Second, the syntax-external character of some aspects of LF (θ-criterion, selectional restrictions, duality of semantics…) and PF (linearization…), which were duplicated within syntax in the Principles and Parameters framework, is fully acknowledged. They are henceforth reassigned to the external components, the semantic component Σ and the phonological component Φ. These theoretical stands have important consequences for the architecture of grammar. Syntactic processes have properties of their own. Some of them are not relevant to interpretation at all. Case-checking, φ-Agree and, to some extent, A-movement are cases in point. As for those, such as A’-movement, which indeed have a clear interpretive import, they cannot be said to be motivated by semantic considerations either. This point is clearly made by Uriagereka (2002: 212), who refers to this aspect of syntactic computations as “semantic blindness.” In other words, syntax just makes available syntactic objects that are taken advantage of to represent the richness of the semantics. 
The autonomy of the computational system and the eviction from syntax proper of (non-structural) semantic and phonological dimensions give rise to a grammatical architecture in which the relation between form and meaning is far from trivial and is certainly not as straightforward as some popular approaches assume.
Alternative architectures have been proposed. Brody (1995) develops a minimalist representational approach in which the lexical input is not related to the interface levels through a derivation—that is, through a finite sequence of computational steps, taken one at a time, with a beginning and an end. In his system, semantic interpretation rules and the lexicon have access to the same interface, the level of Lexico-Logical Form. Jackendoff (1997) abandons the Y architecture completely and argues in favor of a tripartite parallel model, in which conceptual structures and phonological structures are assumed to be derived independently of and in parallel with syntactic structures. The narrow syntactic derivation converges “if it can be mapped through the interfaces into a well-formed phonological structure and conceptual structure” (Jackendoff 1997: 38–39). Although I am quite aware that these alternative options deserve careful attention, I remain loyal, throughout this book, to a derivational Y model of the type developed in the Minimalist Program.
As has often been observed, this model is also representational, since the complex objects that are produced by the computational process and transferred for interpretation to the semantic interface and for realization to the phonetic interface are themselves organized into hierarchically structured levels of representation, expressible as labeled bracketings. This is the second way in which the notion of architecture is directly relevant to the theory of grammar. A major topic, which has been revived by the advent of the Minimalist Program in the nineties, concerns the origin of phrase structure and the mechanisms that build it. It had been observed around 1980 that phrase structure rules could be eliminated on the ground that they were redundant given the specific properties of lexical items, X’-theory, and the general principles of UG (cf. Chomsky 1981; Stowell 1981). As insightfully observed by Freidin and Vergnaud (2001: 644), this elimination left the grammar without any derivational mechanism for syntactic representations. It wasn’t until the definition of the theory of bare phrase structure in the mid-nineties that a new mechanism was introduced. Recall that, at the time, nothing like D-structure existed anymore. Chomsky (1995a) proposed that the hierarchical structure was put into place via the recursive application of the binary operation Merge, taking as input the functional and lexical items themselves and exclusively applying bottom-up. Since part of the information provided by the traditional X’-theory—categorial labels, bar levels—is not expressed anymore in a system based on recursive Merge, it became necessary to integrate into the grammar both a relational definition of levels (cf. Chomsky 1995a) and a labeling algorithm (cf. Chomsky 2008, 2013).
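The mechanics of recursive, bottom-up Merge just described can be sketched in a toy model. The set-based representation follows the spirit of bare phrase structure (a merged object is an unordered set of its two daughters), but the function name and the string atoms are illustrative assumptions, not part of Chomsky’s formalism:

```python
# Toy sketch of bare phrase structure: Merge as binary set formation.
# The use of frozenset encodes the claim that merged objects are
# unordered (linear order is fixed later, at the sound interface).

def merge(alpha, beta):
    """Binary Merge: combine two syntactic objects into an unordered set."""
    return frozenset([alpha, beta])

# Build "the cat", then "saw the cat", strictly bottom-up.
dp = merge("the", "cat")
vp = merge("saw", dp)

# The result is hierarchical, not flat: the DP is a member of the VP,
# while the DP's own members sit one level further down.
assert dp in vp
assert "cat" not in vp
assert "cat" in dp
```

Note that nothing in the derived object records a categorial label or a bar level, which is exactly why a relational definition of levels and a labeling algorithm become necessary.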
A second issue tightly connected with the origin of phrase structure is the emergence and the role of functional categories. The study of clausal structure across a large variety of languages in the eighties and the nineties has revealed that it was much more complex than previously thought and that additional syntactic space was needed to accommodate the complexity of syntactic, morphological and semantic phenomena. Functional categories were introduced—Inflection (Infl), Complementizer (C), Determiner (D)—“extending” lexical/contentful projections and making available more head positions—which constitute natural merger sites for functional items (inflectional markers, complementizers, determiners…) and potential landing sites for moved lexical categories (V, N…) —and more specifiers—hosting nominal expressions or adverbials in clausal domains and adjective phrases in nominal domains. Functional categories, together with the lexical (or functional) projections they select, determine the configurational structure of the major syntactic domains, including the sentence. The hypothesis that parameters are partly or exclusively associated with properties of functional heads (cf. Borer 1984) makes available principled analyses of word order variation across languages. Finally, since they are the designated sites for the insertion of functional items or features, functional categories directly contribute to the emergence of morphologically complex words. Linguists, during the Principles and Parameters era, devoted a lot of energy to establishing the inventory of the functional heads available and identifying the functional structure of phrasal and clausal domains, an effort that originates in Pollock’s (1989) classic article and culminates in the emergence of the cartographic movement.
Minimalism strengthens the role of functional categories even more. First, they are endowed with uninterpretable features that need to be valued. To a large extent, syntactic derivations are driven by the need to get rid of these features, which must be eliminated by the time the derivation reaches the semantic interface; a specific operation, Agree, is defined to achieve this result. Second, some designated functional categories (C, v) function as heads of phases, the domains which, in Chomsky’s (2000, 2001) approach, define computational units, Spell-Out units and interpretive units. Third, some approaches to morphology assume that functional heads are directly responsible for the categorial labeling of their lexical complement. In Marantz (1997), the previously assumed link between syntactic terminals and words is untied: the functional terminals are taken to consist of abstract formal features, which are phonologically realized only post-syntactically, and the lexical terminals to consist of roots, which have no categorial label. The categorial identity of a-categorial roots, as well as the properties associated with the resulting morphologically complex words, emerges from the syntactic combinatorial process associating roots with the formal features of functional heads. This conception, which takes the notion of word to be derivative, strongly recalls Chomsky’s (1995b) characterization of v, the lexico-functional head that combines with lexical V and is, in Chomsky’s view, an integral part of the derived verbal unit.
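The feature-valuation logic of Agree can likewise be illustrated with a minimal sketch. The dictionary encoding of φ-features and the list-based search domain are simplifying assumptions made for exposition, not part of the formal definition of Agree:

```python
# Toy sketch of Agree: a probe carrying unvalued phi-features (None)
# searches its c-command domain (modeled as a closest-first list) for
# a goal whose valued features can value them.

def agree(probe, domain):
    """Value the probe's unvalued features against the closest matching goals."""
    for goal in domain:                       # closest-first search (minimal search)
        for feat, value in goal.items():
            if feat in probe and probe[feat] is None:
                probe[feat] = value           # valuation removes the unvalued feature
        if all(v is not None for v in probe.values()):
            return probe                      # nothing uninterpretable remains
    return probe

T = {"person": None, "number": None}          # T's unvalued phi-set
subject = {"person": 3, "number": "sg"}
agree(T, [subject])
assert T == {"person": 3, "number": "sg"}     # T's features are now valued
```

On this picture, a derivation in which some feature of the probe is still `None` when the semantic interface is reached would fail to converge, which is what drives the application of Agree.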
The third architectural issue that will be discussed here has just been alluded to and concerns the organization of the derivational process itself. Just as the grammatical system is segmented into components whose division of labor must be made precise, derivations are broken into phases, i.e. into linguistic units that define locality domains for syntactic computation, semantic interpretation and phonological realization. This makes it necessary to explore the effects of the derivation by phase and to identify the designated functional categories that correspond to phase heads.
A final issue, which is a direct consequence of the minimalist shift of perspective, is the status of the grammatical principles that govern syntactic processes and their location in the overall dispositif. The fact that Minimalism sharply differs from earlier approaches in this respect is often overlooked. Linguistic research in the seventies and eighties had endeavored to simplify the formulation of operations by factoring out the recurrent conditions they observed and to propose general principles relevant to all of them. In the Principles and Parameters framework, each principle was associated with a specific module—i.e., with an autonomous subtheory of the grammatical system: the Case Filter was an integral part of Case theory, the ECP and the head-movement constraint belonged to movement theory. Minimalism in turn attempted to define deeper principles of much greater generality, from which the effects of the conditions active in the Principles and Parameters framework could be derived. These new theoretical constructs had to be considerably more abstract than the previous ones. They could not be rule-specific, as in the early days of generative grammar, nor construction-specific. Nor could they be integrated into specific modules: modules are much demoted in significance in the minimalist approach. We expect them to be directly connected either to the necessity of achieving an optimal computational design or to that of delivering perfectly legible semantic and phonetic representations to C-I and SM. As Chomsky (2005: 9) observes, such notions as recursion, minimal search and locality fill the bill perfectly, as do extension, derivation by phase and full interpretation. But if one adopts the strictly derivational minimalist bias, the question arises as to where in the derivation, at which interface or in which component each principle applies. In some cases, the answer is obvious.
The conditions on Merge are operative all along the structure-building process. The Principle of Full Interpretation and bare output conditions are obviously relevant at the semantic and phonetic interfaces. But the layout just sketched leaves it as an open empirical question at which representational level the EPP and the various locality conditions apply. Solving this question is no doubt part of the architectural issue. First, by targeting specific points in derivations, general principles give them a distinguished status and contribute to the architecture of grammar. Second, one cannot exclude the possibility that the distinctive character of each component be manifested by the principles that are operative in it, as was the case with modules in the previous framework.
A fundamental question should be asked at this point: why is it necessary for the theory of syntax to build a dispositif of this kind and to postulate mechanisms and principles that far exceed what can be observed directly? As observed by Brody (1995: 1), the theories that account for the relationship between sound and meaning “with reasonable success” generally associate with each expression two detailed representations, a semantic and a phonetic one, which have to be related by various procedures, plausibly involving “complex representations that are composed of smaller units.” It is necessary to piece together the various subsystems the grammar is composed of into a cohesive whole and to make precise the way they interact. Providing the grammar with an internal architecture is just inevitable if one wants to capture the relevant connections. An immediate benefit that is expected from the architectural hypothesis is that it should contribute to reducing the complexity of linguistic knowledge to manageable proportions by modeling it as a system of overarching principles, components and levels, thus making accessible simpler and deeper analyses of empirical phenomena and providing the conceptual setting for dealing with the issue of language acquisition, language variation and language evolution. At a more general level, the architectural hypothesis is also a necessary ingredient of an approach that claims that a better understanding of what the language faculty might be can be attained if language is considered in its relation to the other cognitive systems of the mind-brain with which it interfaces. According to Chomsky (2006: 17), the architecture of grammar straightforwardly reflects the architecture of the language faculty, which is itself “inserted into a system of mind that has a certain architecture.” What one should find ultimately is that language is unique among biological systems and sharply differs from the systems with which it interacts.
Once it is taken for granted that the theory of grammar has to resort to a dispositif of this kind, other, more manageable questions arise, as, I hope, this book will make sufficiently clear.

2 Some Architectural Challenges

It is fair to acknowledge that, in spite of quite impressive conceptual premises and numerous empirical inquiries founded on them, the origin of phrase structure, the segmentation of grammar into components and the phasal architecture of derivations are areas where much empirical and theoretical work still needs to be done. Some questions have been left unsettled, others neglected or simply forgotten along the way. For example, we don’t know with any certainty yet whether bare phrase structure, which doesn’t make available all the structural and categorial information that was previously expressed in X’-theoretical terms, successfully supersedes X’-theory (cf. Gärtner’s 2002 insightful observations on this topic). It has also gone unnoticed that, if one adopts both Marantz’s compositional approach to the building of lexical categories and Chomsky’s labeling algorithm, one is left with two labeling mechanisms in the grammar. They largely differ in function and scope, but both rely on the claim that the categorial definition of a subset of syntactic objects is derivational. One should ask whether it is possible to integrate the two into a consistent model of grammar. Should they be kept distinct or unified? The questions raised by labeling procedures won’t be discussed in this book, although they undoubtedly constitute a major architectural issue.
Among those that are directly tackled here are two problems standing at the intersection of the general issues that have been identified in the preceding section. The first one concerns the articulation of morphology with syntax. The consensus in generative linguistic research is that words are endowed with an internal structure and that this structure is strikingly similar to that of phrases or clauses. As early as 1982, Selkirk stated that “word structure has the same general formal properties as syntactic structure” and that “it is generated by the same sort of rule system.” As a consequence, she proposed to extend the X’-hierarchy of projections below the level of the word (words correspond to the X°-level of structure, roots to the X⁻¹-level). Under Minimalism, X’-theory cannot be involved and it is natural to assume that the word-building process resorts to the basic operation Merge. But once it is agreed that the derivational mechanisms involved are the same or, at least, of the same type, the question arises as to where word-formation processes are located. This issue has not been definitively settled today and still figures on the research agenda. Adopting Chomsky’s (1970) lexicalist position, Selkirk (1982: 2) assigns them to the lexicon, but she observes that locating them in the syntax would not make a great difference. Williams (1981, 2007) and Di Sciullo and Williams (1987) provide compelling arguments that word-formation processes should be confined to the lexicon. Baker (1985, 1988), on the other hand, observes that the rule Move α in the Principles and Parameters framework should be allowed to affect X°-level units and to adjoin them to other X°-level units, giving rise to complex X°-level units, i.e. to morphologically complex words. The result of the head-movement process is a constituent structure that is both syntactic, because it is created in the syntax, and morphological, because it is not larger than a word.
This theoretical move was made possible by Baker’s discovery that incorporation obeys familiar syntactic constraints and that inflectional categories such as tense, aspect and agreement in some languages and the passive or applicative markers in others, which are morphologically realized as affixes on lexical roots, are better viewed as heads in the syntax. The trend known as Distributed Morphology (cf. Halle and Marantz 1993, 1994) assumes that syntax performs all merger operations, including the composition of morphemes within a word and, at the same time, that morphophonology follows syntactic computation. If this radical step is taken, there is no need any more for a mirror principle à la Baker, stipulating that morphological derivations must reflect syntactic derivations, and vice versa. Finally, concerning the origin of (inflectionally) complex words, Chomsky (1995b) claims to adopt a lexicalist stand, but he combines this assumption with a checking mechanism, operating in the narrow syntax and intended to value uninterpretable...

Table of contents

  1. Cover
  2. Title
  3. Copyright
  4. Contents
  5. Acknowledgments
  6. 1 Introduction
  7. Part I Phrasal and Clausal Architecture
  8. Part II Clitics and Phrase Structure
  9. Part III The Architecture of Derivations
  10. Part IV The Architecture of Grammar
  11. References
  12. Index