Does MaxEnt Overgenerate? Implicational Universals in Maximum Entropy Grammar

Arto Anttila, Giorgio Magri


A good linguistic theory should neither undergenerate (i.e., it should not miss any attested patterns) nor overgenerate (i.e., it should not predict any “unattestable” patterns). We investigate the question of overgeneration in Maximum Entropy Grammar (ME) in the context of basic syllabification (Prince and Smolensky 2004) and obstruent voicing (Lombardi 1999), using the theory’s T-order as a measure of typological strength. We find that ME has non-trivial T-orders, but compared to OT and HG, they are relatively sparse and sometimes linguistically counterintuitive. The fact that many reasonable implicational universals fail under ME suggests that the theory overgenerates, at least in the two phonological examples we examine. More generally, our results serve as a reminder that linguistic theories should be evaluated in terms of both descriptive fit and explanatory depth. A good theory succeeds on both fronts: we want a flexible theory that best fits the data, but we also want an informative theory that excludes unnatural patterns and derives the correct implicational universals.
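To make the object of study concrete: in the standard formulation of Maximum Entropy Grammar, each candidate output receives a probability proportional to exp(−harmony), where harmony is the weighted sum of that candidate's constraint violations. The sketch below illustrates this general formula only; the constraint names, violation counts, and weights are hypothetical and are not taken from the paper.

```python
import math

def maxent_probabilities(candidates, weights):
    """Standard MaxEnt grammar: P(candidate) proportional to exp(-harmony),
    where harmony = sum of (weight * violation count) over constraints.

    candidates: dict mapping candidate name -> list of violation counts,
                aligned positionally with `weights`.
    Returns a dict mapping candidate name -> probability.
    """
    harmonies = {c: sum(w * v for w, v in zip(weights, viols))
                 for c, viols in candidates.items()}
    unnormalized = {c: math.exp(-h) for c, h in harmonies.items()}
    z = sum(unnormalized.values())  # normalizing constant
    return {c: u / z for c, u in unnormalized.items()}

# Hypothetical obstruent-voicing tableau for an input like /ad/:
# constraint 1 = a markedness constraint against voiced codas,
# constraint 2 = a faithfulness constraint preserving underlying voicing.
candidates = {
    "ad": [1, 0],  # faithful candidate: violates the markedness constraint
    "at": [0, 1],  # devoiced candidate: violates the faithfulness constraint
}
weights = [2.0, 1.0]  # hypothetical weights; higher weight = stronger penalty

probs = maxent_probabilities(candidates, weights)
# With these weights the devoiced candidate [at] is more probable than
# the faithful [ad], since its harmony (1.0) is lower than [ad]'s (2.0).
```

Unlike categorical OT, where the highest-ranked constraint decides the winner outright, MaxEnt assigns every candidate nonzero probability, which is what makes the question of overgeneration (and the structure of its T-orders) nontrivial.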

Copyright (c) 2018 Arto Anttila, Giorgio Magri