Is Graph Structure Necessary for Multi-hop Question Answering?
Nan Shao, Yiming Cui, Ting Liu, Shijin Wang, Guoping Hu
Question Answering Short Paper
Abstract:
Recently, modeling text as a graph and introducing graph neural networks to process it has become a trend in many NLP research areas. In this paper, we investigate whether graph structure is necessary for textual multi-hop reasoning, centering our analysis on HotpotQA. We construct a strong baseline model to establish that, with proper use of pre-trained models, graph structure may not be necessary for textual multi-hop reasoning. We point out that both the graph structure and the adjacency matrix are task-related prior knowledge, and that graph-attention can be considered a special case of self-attention. Experiments demonstrate that graph-attention, or even the entire graph structure, can be replaced by self-attention or Transformers.
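To make the central claim concrete, below is a minimal sketch (not the authors' implementation) of why graph-attention can be viewed as a special case of self-attention: using the adjacency matrix as an attention mask restricts ordinary self-attention to graph edges. The function name, tensor shapes, and the `adj` parameter are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def masked_self_attention(q, k, v, adj=None):
    """Scaled dot-product self-attention over node representations.

    q, k, v: (num_nodes, d) node features.
    adj:     optional (num_nodes, num_nodes) 0/1 adjacency matrix.
             With adj=None this is plain self-attention; passing an
             adjacency matrix recovers graph-attention, because scores
             between unconnected nodes are masked to -inf before softmax.
    """
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5   # (N, N) attention logits
    if adj is not None:
        scores = scores.masked_fill(adj == 0, float("-inf"))
    return F.softmax(scores, dim=-1) @ v          # (N, d) updated node features

# With a fully connected adjacency matrix, the two variants coincide,
# which is the sense in which graph-attention is a special case.
x = torch.randn(4, 8)
full_adj = torch.ones(4, 4)
assert torch.allclose(masked_self_attention(x, x, x),
                      masked_self_attention(x, x, x, full_adj))
```

Under this view, the adjacency matrix is just task-related prior knowledge baked into the attention mask; a Transformer layer with an unrestricted mask can, in principle, learn the same connectivity pattern.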