Pre-training Mention Representations in Coreference Models
Yuval Varkel, Amir Globerson
Semantics: Sentence-level Semantics, Textual Inference and Other Areas (Short Paper)
Abstract:
Collecting labeled data for coreference resolution is a challenging task, requiring skilled annotators. It is thus desirable to develop coreference resolution models that can make use of unlabeled data. Here we provide such an approach for the powerful class of neural coreference models. These models rely on representations of mentions, and we show that these representations can be learned in a self-supervised manner that improves resolution accuracy. We propose two self-supervised tasks that are closely related to coreference resolution and thus improve mention representations. Applying this approach to the GAP dataset yields new state-of-the-art results.
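The abstract does not spell out the two pre-training tasks, but the general recipe (shape mention representations with a self-supervised signal on unlabeled text before training the resolver) can be illustrated concretely. Below is a minimal sketch of one plausible signal of this kind: a contrastive loss that pulls together mentions a cheap heuristic marks as likely coreferent, such as identical surface strings within one document, and pushes apart the rest. The heuristic, the name mention_contrastive_loss, and all parameters are illustrative assumptions, not the paper's actual tasks.

```python
import torch
import torch.nn.functional as F

def mention_contrastive_loss(reprs: torch.Tensor,
                             group_ids: torch.Tensor,
                             temperature: float = 0.1) -> torch.Tensor:
    """Self-supervised contrastive loss over mention representations.

    reprs:     (n, d) mention vectors from any encoder.
    group_ids: (n,) ints; mentions sharing an id are treated as positives
               (a heuristic stand-in, not the paper's actual signal).
    """
    reprs = F.normalize(reprs, dim=-1)                # cosine-similarity space
    sims = reprs @ reprs.t() / temperature            # (n, n) similarity matrix
    eye = torch.eye(len(reprs), dtype=torch.bool, device=reprs.device)
    sims = sims.masked_fill(eye, float('-inf'))       # exclude self-pairs
    log_prob = F.log_softmax(sims, dim=-1)            # distribution over other mentions
    pos = group_ids.unsqueeze(0).eq(group_ids.unsqueeze(1)) & ~eye
    # mean log-probability of the positives for each anchor that has any
    per_anchor = log_prob.masked_fill(~pos, 0.0).sum(-1) / pos.sum(-1).clamp(min=1)
    has_pos = pos.any(dim=-1)
    return -per_anchor[has_pos].mean()

# Toy usage: four mentions; mentions 0 and 2 share a surface string.
reprs = torch.randn(4, 16, requires_grad=True)
loss = mention_contrastive_loss(reprs, torch.tensor([0, 1, 0, 2]))
loss.backward()
```

In a full pipeline, a loss of this kind would be minimized on unlabeled documents before the supervised coreference stage, so the encoder starts from mention vectors that already cluster plausible coreferents.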