Generative Grammar and the Faculty of Language: Discoveries, Questions, and Challenges

Chomsky et al. examine the foundations of generative grammar and the human language faculty. The article highlights core properties such as discrete infinity and displacement, explores operations like MERGE and AGREE, and analyses cognitive interfaces. Despite progress, unresolved questions remain, opening new directions for future linguistic research.

Chomsky, N. et al. (2019). Generative Grammar and the Faculty of Language: Discoveries, Questions, and Challenges. Catalan Journal of Linguistics, 229–261.

What Is Generative Grammar?

Generative Grammar (GG) studies linguistic capacity as part of human cognition. Initiated by Noam Chomsky, it is based on the idea that humans are unique in generating an unbounded number of meaningful expressions from a finite set of discrete units. The theory of Universal Grammar (UG) provides the principles that underlie this capacity. Although the evolutionary origin of UG is uncertain, current proposals converge on the view that it emerged relatively suddenly and recently in human evolutionary history.

Chomsky and the Origins of Universal Grammar

Chomsky’s work established that UG is not just a cultural artefact but a biological endowment. It explains why children acquire language so rapidly despite limited input, and why all human languages share deep structural properties.


The Faculty of Language and Its Core Properties

Discrete Infinity in Linguistics

Human language exhibits discrete infinity: the ability to generate endless hierarchical structures from a limited set of rules and symbols. This property distinguishes human language from other communication systems.
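The recursive character of discrete infinity can be made concrete with a toy grammar; the rule and vocabulary below are illustrative assumptions, not taken from the article.

```python
# A toy illustration: a single recursive rule over a finite vocabulary
# yields an unbounded set of distinct, well-formed sentences.

def sentence(depth):
    """S -> "Mary left"  |  "John thinks that " + S  (right-recursive toy rule)."""
    if depth == 0:
        return "Mary left"
    return "John thinks that " + sentence(depth - 1)

print(sentence(0))  # Mary left
print(sentence(2))  # John thinks that John thinks that Mary left
```

Each choice of `depth` produces a new, structurally distinct sentence, so the finite rule set generates an infinite language.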

Displacement as a Defining Feature

Displacement allows a linguistic element to be pronounced in one position yet interpreted in another. For instance, in a question like What did you say __ yesterday?, the object what is displaced from its base position after say. This phenomenon requires a computational system capable of relating elements across structural positions.


Key Operations in Generative Grammar

External and Internal MERGE

The fundamental operation in GG is MERGE, which combines two elements into a new structured unit.

  • External MERGE (EM): combines independent elements.

  • Internal MERGE (IM): moves an element already inside a structure into a new position.
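As a rough sketch, MERGE is often modelled as binary set formation, with Internal MERGE re-using an element already contained in the structure. The helper names below are hypothetical, and the containment check is a simplification of the article's notion of syntactic structure.

```python
# A minimal sketch of MERGE as unordered binary set formation.
# Names are illustrative; real analyses also track copies and labels.

def merge(x, y):
    """MERGE(X, Y) = {X, Y}: combine two syntactic objects into a set."""
    return frozenset([x, y])

def contains(structure, element):
    """Check whether a syntactic object contains the element at any depth."""
    if structure == element:
        return True
    if isinstance(structure, frozenset):
        return any(contains(part, element) for part in structure)
    return False

def internal_merge(structure, element):
    """IM: re-merge an element already inside the structure (displacement)."""
    assert contains(structure, element)
    return merge(element, structure)

# External MERGE (EM): the two inputs are independent objects.
vp = merge("say", "what")           # {say, what}

# Internal MERGE (IM): "what" now occurs in two positions.
cp = internal_merge(vp, "what")     # {what, {say, what}}
```

On this view, displacement requires no separate movement operation: IM is simply MERGE applied to an element the structure already contains.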

AGREE and TRANSFER Mechanisms

Other core operations include:

  • AGREE, which establishes agreement relations between syntactic elements, such as subject–verb agreement in person and number.

  • TRANSFER, which sends linguistic structures to two interface systems: SEM (for meaning) and PHON (for sound or sign).
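AGREE can be pictured as feature valuation: a probe carrying unvalued features copies values from a matching goal. The feature inventory and names below are illustrative assumptions, not the article's formalism.

```python
# A toy model of AGREE as feature valuation: unvalued features on the
# probe (marked None) are valued by the goal's matching features.

def agree(probe, goal):
    """Return a copy of the probe with its unvalued features valued."""
    valued = dict(probe)
    for feature, value in probe.items():
        if value is None and feature in goal:
            valued[feature] = goal[feature]
    return valued

T = {"person": None, "number": None}      # probe: unvalued phi-features on T
subject = {"person": 3, "number": "sg"}   # goal: valued phi on the subject

print(agree(T, subject))  # {'person': 3, 'number': 'sg'}
```

Once valuation is complete, TRANSFER can hand the structure off to SEM and PHON for interpretation and externalisation.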


Interfaces Between Grammar and Cognition

Conceptual-Interpretive Interface (SEM)

The SEM interface maps linguistic structures to meaning. It determines how hierarchical structures are interpreted conceptually and ensures coherence between syntax and semantics.

Sensorimotor Interface (PHON)

The PHON interface externalises structures through speech or sign. It explains how abstract structures are converted into linear sequences that can be spoken, written, or signed.

Together, these interfaces make language both a cognitive and a communicative system.


Open Questions and Future Challenges

Efficiency Principles in Language

A central question is whether the properties of language can be explained by principles of computational efficiency. GG assumes the system is optimally designed, avoiding redundant operations.

Morphology, Phonology, and Variation

Another area of debate concerns how morphology and phonology contribute to cross-linguistic variation. Externalisation factors may explain why languages differ in form despite sharing universal properties.

Unresolved Issues

  • How does the system distinguish between copies and repetitions?

  • Must MERGE always produce endocentric structures with a clear label?

  • Is a special operation, PAIR-MERGE, needed to account for adjunction and coordination?


Conclusions on Generative Grammar and the Faculty of Language

Generative Grammar has achieved remarkable progress in explaining how language works as a computational system of the human mind. Yet, many questions remain open. Future research may reduce theoretical complexity and identify more fundamental properties of the language faculty.