Learnability Properties of Parametric Models for Natural Language Acquisition
Dissertation, Rutgers, The State University of New Jersey - New Brunswick (1995)
Abstract
Parametric models of grammar have been proposed by linguists and psychologists over the past decade to explain variation among human languages and the human ability to learn them. In this dissertation I propose a rigorous and general definition of parametric models and show how long-standing problems in language learnability, such as Wexler and Manzini's Dependence Problem, can be solved on the basis of the proposed definition, independently of empirically motivated descriptive features of parametric models. I then use the proposed definition to describe a general method for finding all the available maturational solutions to the local maxima problem that Gibson and Wexler's Triggering Learning Algorithm (TLA) encounters on parametric models of a certain nature. In so doing I offer a detailed description of the Maturational Triggering Learning Algorithm, an extension of the TLA. Finally, I rigorously define several ways to study partial descriptions of parametric models, and for each of them I provide conditions sufficient to extend to the model as a whole a positive or negative learnability result that has been established about the partial description with respect to the TLA. A concluding discussion is dedicated to a general method for evaluating candidate parametric learning algorithms with respect to their ability to embody features that research in developmental psycholinguistics shows to be present in human learners.
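For readers unfamiliar with the TLA, the following is a minimal sketch of its core update rule as described by Gibson and Wexler (1994), not the formalization developed in the dissertation. The toy grammar space, the `parses` predicate, and the sentence encoding are hypothetical simplifications for illustration: a grammar is a tuple of binary parameters, and a sentence is taken to be parsable iff the grammar has the required values on the parameters the sentence depends on.

```python
import random

def parses(grammar, sentence):
    """Toy parsability check (hypothetical): sentence = (indices, values),
    and the grammar parses it iff it matches those values at those indices."""
    indices, values = sentence
    return all(grammar[i] == v for i, v in zip(indices, values))

def tla_step(grammar, sentence, rng):
    """One step of the Triggering Learning Algorithm.
    If the current grammar fails on the input, flip ONE randomly chosen
    parameter (Single Value Constraint) and adopt the new grammar only if
    it now parses the input (Greediness Constraint)."""
    if parses(grammar, sentence):
        return grammar  # no error, no change
    i = rng.randrange(len(grammar))
    candidate = grammar[:i] + (1 - grammar[i],) + grammar[i + 1:]
    if parses(candidate, sentence):
        return candidate  # greedy: keep the flip only if it helps
    return grammar
```

Because each step changes at most one parameter and only when the change immediately yields a parse, the learner can reach grammars from which no single flip parses any further input even though the target grammar has not been reached; these are the local maxima whose maturational solutions the dissertation enumerates.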