Generalizing French Schwa Deletion: the Role of Indexed Constraints*
Keywords: Indexed constraints, French, Schwa deletion, Optionality, Variation, Lexically specific processes, Generalizability
Indexed constraints (like cophonologies) increase a grammar’s fit to seen data, but do they hurt the grammar’s ability to generalize to unseen data? We focus on French schwa deletion, an optional process whose rate of application is modulated by both phonological and lexical factors, and we propose three indexed constraint learners in the Maximum Entropy (MaxEnt) framework. Using data from Racine (2008), we test four learners’ ability to capture existing patterns and to generalize to unseen data: the three indexation learners and a control MaxEnt learner without indexed constraint induction. The indexed constraint learners indeed fit the training data more closely than the control. The resulting grammars are then tested on a different schwa deletion dataset, from Smith & Pater (2020). We show that indexed constraints do not lead to a drop in generalization to these data, and that one of the indexation learners produces a grammar that predicts Smith & Pater’s data quite closely. We conclude that indexed constraints do not necessarily hurt a grammar’s ability to generalize to unseen data, while allowing the grammar to achieve a closer fit to the training data.
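To make the core mechanism concrete, here is a minimal sketch of how a MaxEnt grammar with a lexically indexed constraint assigns probabilities to candidates. The constraint names, violation profiles, and weights below are illustrative assumptions, not values from the paper: a word indexed to a cloned markedness constraint (here, a hypothetical `*SCHWA_L1`) incurs an extra word-specific penalty, shifting its deletion rate relative to unindexed words.

```python
import math

# Illustrative constraint weights (assumed values, not fitted to any data).
# "*SCHWA_L1" is a hypothetical lexically indexed clone of "*SCHWA" that
# only applies to words bearing the L1 index.
weights = {"*SCHWA": 1.0, "MAX-V": 2.0, "*SCHWA_L1": 1.5}

def maxent_probs(candidates):
    """Return MaxEnt probabilities P(cand) proportional to exp(-harmony),
    where harmony is the weighted sum of constraint violations."""
    harmonies = {
        cand: sum(weights.get(c, 0.0) * v for c, v in viols.items())
        for cand, viols in candidates.items()
    }
    z = sum(math.exp(-h) for h in harmonies.values())
    return {cand: math.exp(-h) / z for cand, h in harmonies.items()}

# An unindexed word: schwa retention violates *SCHWA, deletion violates MAX-V.
plain = maxent_probs({
    "retain": {"*SCHWA": 1},
    "delete": {"MAX-V": 1},
})

# A word indexed to *SCHWA_L1: its retaining candidate also violates the
# indexed clone, so deletion becomes more probable for this word alone.
indexed = maxent_probs({
    "retain": {"*SCHWA": 1, "*SCHWA_L1": 1},
    "delete": {"MAX-V": 1},
})
```

The point of the sketch is that indexation changes rates word by word: the indexed word's deletion probability rises while the rest of the lexicon is unaffected, which is how such grammars can tighten their fit to lexically conditioned variation.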
Published by the LSA with permission of the author(s) under a CC BY 3.0 license.