All Together Now: Concurrent Learning of Multiple Structures in an Artificial Language

Cognitive Science 37 (7):1290-1320 (2013)

Abstract

Natural languages contain many layers of sequential structure, from the distribution of phonemes within words to the distribution of phrases within utterances. However, most research modeling language acquisition with artificial languages has focused on only one type of distributional structure at a time. In two experiments, we investigated adult learning of an artificial language containing dependencies between both adjacent and non-adjacent words. We found that learners rapidly acquired both types of regularities and that the strength of the adjacent statistics influenced learning of both adjacent and non-adjacent dependencies. Additionally, although accuracy was similar for both types of structure, participants' knowledge of the deterministic non-adjacent dependencies was more explicit than their knowledge of the probabilistic adjacent dependencies. The results are discussed in the context of current theories of statistical learning and language acquisition.
