NeuSpell: A Neural Spelling Correction Toolkit

Sai Muralidhar Jayanthi, Danish Pruthi, Graham Neubig

Demo Paper

Gather-4K: Nov 18 (02:00-04:00 UTC)

Abstract: We introduce NeuSpell, an open-source toolkit for spelling correction in English. Our toolkit comprises ten different models, and benchmarks them on naturally occurring misspellings from multiple sources. We find that many systems do not adequately leverage the context around the misspelt token. To remedy this, (i) we train neural models using spelling errors in context, synthetically constructed by reverse engineering isolated misspellings; and (ii) use richer representations of the context. By training on our synthetic examples, correction rates improve by 9% (absolute) compared to the case when models are trained on randomly sampled character perturbations. Using richer contextual representations boosts the correction rate by another 3%. Our toolkit enables practitioners to use our proposed and existing spelling correction systems, both via a simple unified command line, as well as a web interface. Among many potential applications, we demonstrate the utility of our spell-checkers in combating adversarial misspellings. The toolkit can be accessed at neuspell.github.io.
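Illustrative only: the snippet below sketches how the toolkit's Python interface might be used to run one of the bundled checkers. The names BertChecker, from_pretrained, correct, and correct_strings follow the toolkit's public documentation but are not stated on this page, so treat them as assumptions rather than a definitive API reference.

```python
# A minimal usage sketch, assuming the neuspell package is installed
# (e.g. via `pip install neuspell`). Class and method names here are
# assumptions based on the toolkit's documentation, not on this page.

from neuspell import BertChecker

# Load one of the toolkit's spelling-correction models
# (here, the BERT-based checker).
checker = BertChecker()
checker.from_pretrained()

# Correct a single sentence containing misspellings.
print(checker.correct("I luk foward to receving your reply"))

# Correct a batch of sentences in one call.
print(checker.correct_strings([
    "a quck brown fox jumps over the lazy dog",
    "this toolkt corrects speling mistakes in contxt",
]))
```

The same checkers are also exposed through the toolkit's command-line and web interfaces mentioned in the abstract; the Python route above is just one way to access them.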

Similar Papers

Sparse Text Generation
Pedro Henrique Martins, Zita Marinho, André F. T. Martins

Pronoun-Targeted Fine-tuning for NMT with Hybrid Losses
Prathyusha Jwalapuram, Shafiq Joty, Youlin Shen

On Losses for Modern Language Models
Stéphane Aroca-Ouellette, Frank Rudzicz