Is Graph Structure Necessary for Multi-hop Question Answering?

Nan Shao, Yiming Cui, Ting Liu, Shijin Wang, Guoping Hu

Question Answering Short Paper

Zoom-11C: Nov 18 (08:00-09:00 UTC)


Abstract: Recently, modeling texts as graph structures and introducing graph neural networks to process them has become a trend in many NLP research areas. In this paper, we investigate whether graph structure is necessary for textual multi-hop reasoning. Our analysis is centered on HotpotQA. We construct a strong baseline model to establish that, with proper use of pre-trained models, graph structure may not be necessary for textual multi-hop reasoning. We point out that both the graph structure and the adjacency matrix are task-related prior knowledge, and that graph attention can be considered a special case of self-attention. Experiments demonstrate that graph attention or the entire graph structure can be replaced by self-attention or Transformers, respectively.
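To make the reduction concrete: graph attention is self-attention whose scores are masked by the graph's adjacency matrix, so a fully connected adjacency matrix recovers ordinary self-attention. The following is a minimal illustrative sketch in PyTorch, not the paper's implementation; the function name, dimensions, and toy adjacency matrix are all hypothetical.

```python
import torch
import torch.nn.functional as F


def masked_attention(h, w_q, w_k, w_v, mask=None):
    """Scaled dot-product attention over node features h: (n, d).

    mask is an (n, n) boolean matrix with True where attending is allowed:
    passing the graph's adjacency matrix gives graph attention, while
    mask=None (or an all-True mask) gives plain self-attention.
    """
    q, k, v = h @ w_q, h @ w_k, h @ w_v
    scores = (q @ k.T) / (k.shape[-1] ** 0.5)
    if mask is not None:
        scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v


# Toy setup: 4 nodes with 8-dimensional features (sizes are arbitrary).
n, d = 4, 8
h = torch.randn(n, d)
w_q, w_k, w_v = torch.randn(d, d), torch.randn(d, d), torch.randn(d, d)

# Hypothetical adjacency matrix (self-loops included so no row is empty).
adj = torch.tensor([[1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [0, 1, 1, 1],
                    [0, 0, 1, 1]], dtype=torch.bool)

graph_out = masked_attention(h, w_q, w_k, w_v, mask=adj)   # graph attention
self_out = masked_attention(h, w_q, w_k, w_v)              # self-attention

# With a fully connected "graph", graph attention equals self-attention.
full = torch.ones(n, n, dtype=torch.bool)
assert torch.allclose(masked_attention(h, w_q, w_k, w_v, mask=full), self_out)
```

In this view, the adjacency matrix is simply task-related prior knowledge injected as an attention mask, which is the sense in which the abstract calls graph attention a special case of self-attention.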


Similar Papers

Modeling Global and Local Node Contexts for Text Generation from Knowledge Graphs
Leonardo F. R. Ribeiro, Yue Zhang, Claire Gardent, Iryna Gurevych
Scalable Multi-Hop Relational Reasoning for Knowledge-Aware Question Answering
Yanlin Feng, Xinyue Chen, Bill Yuchen Lin, Peifeng Wang, Jun Yan, Xiang Ren