Advancing Seq2seq with Joint Paraphrase Learning

So Yeon Min, Preethi Raghavan, Peter Szolovits

Workshop paper at the 3rd Clinical Natural Language Processing Workshop (Clinical NLP 2020)


Abstract: We address the problem of model generalization for sequence-to-sequence (seq2seq) architectures. We propose going beyond data augmentation via paraphrase-optimized multi-task learning and observe that it is useful in correctly handling unseen sentential paraphrases as inputs. Our models greatly outperform SOTA seq2seq models for semantic parsing on diverse domains (Overnight: up to 3.2%; emrQA: 7%) and Nematus, the winning solution for WMT 2017, for Czech-to-English translation (CzEng 1.6: 1.5 BLEU).
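
The abstract only names the approach, so the following is a minimal sketch of one way to combine the standard seq2seq objective with an auxiliary paraphrase objective over a shared encoder. The architecture, the cosine-similarity loss, and all identifiers (e.g. `JointSeq2seq`, `alpha`) are illustrative assumptions, not the paper's exact method.

```python
# Minimal sketch (PyTorch) of joint seq2seq + paraphrase multi-task learning
# with a shared encoder. Hypothetical illustration; the paper's actual
# architecture and losses may differ.
import torch
import torch.nn as nn


class JointSeq2seq(nn.Module):
    def __init__(self, vocab_size, hidden=256, pad_idx=0):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden, padding_idx=pad_idx)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)
        self.ce = nn.CrossEntropyLoss(ignore_index=pad_idx)

    def encode(self, src):
        # Return the final encoder state, shape (1, batch, hidden).
        _, h = self.encoder(self.embed(src))
        return h

    def forward(self, src, tgt_in, tgt_out, paraphrase=None, alpha=0.5):
        # Primary task: standard token-level cross-entropy for seq2seq.
        h = self.encode(src)
        dec, _ = self.decoder(self.embed(tgt_in), h)
        loss = self.ce(self.out(dec).transpose(1, 2), tgt_out)

        if paraphrase is not None:
            # Auxiliary task (assumed form): pull the encodings of an input
            # and one of its sentential paraphrases closer together.
            h_para = self.encode(paraphrase)
            para_loss = (1 - nn.functional.cosine_similarity(
                h.squeeze(0), h_para.squeeze(0), dim=-1)).mean()
            loss = loss + alpha * para_loss
        return loss
```

In a training loop under these assumptions, batches that come with an available paraphrase pass it to `forward`, while the remaining batches fall back to the plain seq2seq loss, so the paraphrase signal acts as a regularizer rather than as extra data augmentation.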