How abstract is more abstract? Learning abstract underlying representations*
Published online by Cambridge University Press: 14 August 2017
Abstract
This paper presents a Maximum Entropy learner of grammars and lexicons (MaxLex), and demonstrates that MaxLex has an emergent preference for minimally abstract underlying representations. In order to keep the weight of faithfulness constraints low, the learner attempts to fill gaps in the lexical distribution of segments, making the underlying segment inventory more feature-economic. Even when the learner only has access to individual forms, properties of the entire system are implicitly available through the relative weighting of constraints. These properties lead to a preference for some abstract underlying representations over others, mitigating the computational difficulty of searching a large set of abstract forms. MaxLex is shown to be successful in learning certain abstract underlying forms through simulations based on the [i]~[Ø] alternation in Klamath verbs. The Klamath pattern cannot be represented or learned using concrete underlying representations, but MaxLex successfully learns both the phonotactic patterns and minimally abstract underlying representations.
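A Maximum Entropy grammar of the kind the abstract refers to assigns each output candidate a probability proportional to the exponential of its (negated) weighted constraint violations. The sketch below is a generic illustration of that computation, not the paper's MaxLex implementation; the tableau, constraint weights and violation counts are invented for illustration.

```python
import math

def maxent_probs(violations, weights):
    """Probability of each output candidate under a MaxEnt grammar.

    violations: one violation vector per candidate (counts per constraint)
    weights:    non-negative constraint weights
    A candidate's harmony is the weighted sum of its violations;
    P(candidate) is proportional to exp(-harmony), normalised over
    all candidates in the tableau.
    """
    harmonies = [sum(w * v for w, v in zip(weights, cand))
                 for cand in violations]
    exps = [math.exp(-h) for h in harmonies]
    z = sum(exps)
    return [e / z for e in exps]

# Hypothetical two-candidate tableau with two constraints
# (say, one markedness and one faithfulness constraint):
# candidate 1 violates the second constraint once,
# candidate 2 violates the first constraint once.
probs = maxent_probs([[0, 1], [1, 0]], weights=[2.0, 1.0])
```

With these weights the candidate that only violates the lower-weighted constraint receives the higher probability, which is the sense in which keeping faithfulness weights low lets unfaithful (abstract) mappings win.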
Copyright © Cambridge University Press 2017
Footnotes
This paper has benefited greatly from comments on previous drafts by Karen Jesney, Reed Blaylock, Khalil Iskarous, Hayeon Jang, Roumyana Pancheva and Rachel Walker, as well as from the editors and two anonymous reviewers. The work has also benefited from discussions with the above-mentioned people and with Eric Baković, Michael Becker, Paul de Lacy, Josh Falk, Jeffrey Heinz, Brian Hsu, Martin Krämer, Giorgio Magri, Mairym Lloréns Monteserín, Joe Pater, Ezer Rasin, Jason Riggle, Stephanie Shih, Brian Smith and Adam Ussishkin, as well as with audiences at OCP 12, SSILA 2015, CLS 51 and the USC PhonLunch.
The stimuli referred to in the paper are available as online supplementary materials at https://doi.org/10.1017/S0952675717000161.