Does MaxEnt Overgenerate? Implicational Universals in Maximum Entropy Grammar

Authors

  • Arto Anttila
  • Giorgio Magri (CNRS, UiL-OTS)

DOI:

https://doi.org/10.3765/amp.v5i0.4260

Abstract

A good linguistic theory should neither undergenerate (i.e., it should not miss any attested patterns) nor overgenerate (i.e., it should not predict any "unattestable" patterns). We investigate the question of overgeneration in Maximum Entropy Grammar (ME) in the context of basic syllabification (Prince and Smolensky 2004) and obstruent voicing (Lombardi 1999), using the theory's T-order as a measure of typological strength. We find that ME has non-trivial T-orders, but compared to those of Optimality Theory (OT) and Harmonic Grammar (HG), they are relatively sparse and sometimes linguistically counterintuitive. The fact that many reasonable implicational universals fail under ME suggests that the theory overgenerates, at least in the two phonological examples we examine. More generally, our results serve as a reminder that linguistic theories should be evaluated in terms of both descriptive fit and explanatory depth. A good theory succeeds on both fronts: we want a flexible theory that best fits the data, but we also want an informative theory that excludes unnatural patterns and derives the correct implicational universals.
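
For readers unfamiliar with the formalism, the sketch below illustrates how a MaxEnt grammar assigns probabilities to output candidates: each candidate's harmony is the negative weighted sum of its constraint violations, and probabilities are obtained by exponentiating and normalizing over the candidate set. The constraint names (ONSET, DEP), violation counts, and weights are purely illustrative and are not taken from the paper.

```python
import math

def maxent_probabilities(candidates, weights):
    """Compute MaxEnt candidate probabilities for one input.

    candidates: dict mapping each output candidate to its list of
    constraint violation counts; weights: matching list of
    non-negative constraint weights.
    """
    # Harmony of a candidate: negative weighted sum of its violations.
    harmonies = {
        cand: -sum(w * v for w, v in zip(weights, violations))
        for cand, violations in candidates.items()
    }
    # Probability = exponentiated harmony, normalized over all candidates.
    z = sum(math.exp(h) for h in harmonies.values())
    return {cand: math.exp(h) / z for cand, h in harmonies.items()}

# Hypothetical toy tableau for an onsetless input /a/ (basic syllabification):
# [a] violates ONSET; epenthesized [ta] violates DEP. Weights are illustrative.
candidates = {
    "a":  [1, 0],   # ONSET violation
    "ta": [0, 1],   # DEP violation
}
weights = [2.0, 1.0]  # [ONSET, DEP]
print(maxent_probabilities(candidates, weights))
```

Roughly speaking, an implicational universal in this probabilistic setting (an entry in the ME T-order) holds when the probability of one mapping is at least as high as that of another for every choice of non-negative constraint weights.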

Published

2018-02-10