Solving the UG Problem

Denis Bouchard


Many generalizations are impossible to learn via primary linguistic data, so they are assumed to be part of our genetic endowment. Generativists have tried to reduce Universal Grammar (UG) to a minimum, in particular by appealing to computational efficiency. In principle, this is an important improvement. The bottom line, however, is how well this computational approach explains the data. Unfortunately, it does not. Thus current analyses of subject–AUX inversion still appeal implicitly to several UG constraints in addition to structure dependence. Moreover, this fails empirically even in the wildest cases, such as forming questions by reversing the word order of a declarative. Fortunately, there is a way out of this impasse. Learners realize that different orders of constituents correlate with different meanings. Generating Tense in Comp compositionally derives a polar interrogative interpretation. The logically prior properties of the perceptual and conceptual systems impose constraints that are sufficient to explain language acquisition.


Keywords: Universal Grammar; learnability; structure dependence; semantics.
