PyMT5: multi-mode translation of natural language and Python code with transformers

Colin Clement, Dawn Drain, Jonathan Timcheck, Alexey Svyatkovskiy, Neel Sundaresan

NLP Applications Long Paper

Zoom-15B: Nov 18 (23:00-00:00 UTC)


Abstract: Simultaneously modeling source code and natural language has many exciting applications in automated software development and understanding. Toward this goal, we introduce PyMT5, the Python method text-to-text transfer transformer, which is trained to translate between all pairs of Python method feature combinations: a single model that can both predict whole methods from natural language documentation strings (docstrings) and summarize code into docstrings of any common style. We present an analysis and modeling effort on a large-scale parallel corpus of 26 million Python methods and 7.7 million method-docstring pairs, demonstrating that for both docstring and method generation, PyMT5 outperforms similarly sized auto-regressive language models (GPT-2) that were either pre-trained on English or randomly initialized. On the CodeSearchNet test set, our best model predicts syntactically correct method bodies 92.1% of the time, achieves a BLEU score of 8.59 for method generation and 16.3 for docstring generation (summarization), and achieves a ROUGE-L F-score of 24.8 for method generation and 36.7 for docstring generation.
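The parallel corpus the abstract describes pairs Python methods with their docstrings. As a minimal illustrative sketch (not the authors' published pipeline), such method-docstring pairs can be mined from Python source with the standard-library ast module:

import ast
import textwrap

def extract_method_docstring_pairs(source):
    # Illustrative sketch: walk the AST and yield (method_source, docstring)
    # pairs, the unit of the parallel corpus described in the abstract.
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            docstring = ast.get_docstring(node)
            if docstring is not None:
                # ast.get_source_segment requires Python 3.8+.
                yield ast.get_source_segment(source, node), docstring

example = textwrap.dedent("""
    def area(r):
        '''Return the area of a circle of radius r.'''
        import math
        return math.pi * r ** 2
""")

for method, doc in extract_method_docstring_pairs(example):
    print("docstring:", doc)
    print("method:")
    print(method)

Feeding such pairs in both directions (docstring-to-method and method-to-docstring) is what lets a single text-to-text model cover the multiple translation modes described above.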


Similar Papers

Substance over Style: Document-Level Targeted Content Transfer
Allison Hegel, Sudha Rao, Asli Celikyilmaz, Bill Dolan
CSP: Code-Switching Pre-training for Neural Machine Translation
Zhen Yang, Bojie Hu, Ambyera Han, Shen Huang, Qi Ju
Sparse Text Generation
Pedro Henrique Martins, Zita Marinho, André F. T. Martins