Does MaxEnt Overgenerate? Implicational Universals in Maximum Entropy Grammar
DOI:
https://doi.org/10.3765/amp.v5i0.4260
Abstract
A good linguistic theory should neither undergenerate (i.e., it should not miss any attested patterns) nor overgenerate (i.e., it should not predict any "unattestable" patterns). We investigate the question of overgeneration in Maximum Entropy Grammar (ME) in the context of basic syllabification (Prince and Smolensky 2004) and obstruent voicing (Lombardi 1999), using the theory's T-order as a measure of typological strength. We find that ME has non-trivial T-orders, but compared to OT and HG, they are relatively sparse and sometimes linguistically counterintuitive. The fact that many reasonable implicational universals fail under ME suggests that the theory overgenerates, at least in the two phonological examples we examine. More generally, our results serve as a reminder that linguistic theories should be evaluated in terms of both descriptive fit and explanatory depth. A good theory succeeds on both fronts: we want a flexible theory that best fits the data, but we also want an informative theory that excludes unnatural patterns and derives the correct implicational universals.
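As background to the abstract, a Maximum Entropy Grammar assigns each output candidate a probability proportional to the exponential of its harmony, i.e., the negated weighted sum of its constraint violations. The sketch below is a minimal illustration of that standard formulation; the constraint weights and violation counts are invented for the example and are not drawn from the paper.

```python
import math

def maxent_probabilities(weights, violation_profiles):
    """Candidate probabilities under a MaxEnt grammar:
    P(c) is proportional to exp(-sum_k w_k * violations_k(c))."""
    # Harmony of each candidate: negated weighted violation sum.
    harmonies = [-sum(w * v for w, v in zip(weights, profile))
                 for profile in violation_profiles]
    # Normalize over the candidate set (softmax over harmonies).
    z = sum(math.exp(h) for h in harmonies)
    return [math.exp(h) / z for h in harmonies]

# Hypothetical example: two constraints (say, Onset and NoCoda)
# with weights 2.0 and 1.0, and two candidates whose violation
# counts are [0, 1] and [1, 0] respectively.
probs = maxent_probabilities([2.0, 1.0], [[0, 1], [1, 0]])
```

Because probabilities vary continuously with the weights, a pattern holds only probabilistically in ME, which is why the paper measures typological predictions through T-orders rather than categorical winners.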
Published
2018-02-10
Issue
Section
Proceedings
License
Published by the LSA with permission of the author(s) under a CC BY 3.0 license.