REPT: Bridging Language Models and Machine Reading Comprehension via Retrieval-Based Pre-training
technical paper

ACL-IJCNLP 2021

03 August 2021

Thailand


Pre-trained Language Models (PLMs) have achieved great success on Machine Reading Comprehension (MRC) over the past few years. Although the general language representation learned from large-scale corpora does benefit MRC, the poor support for evidence extraction, which requires reasoning across multiple sentences, hinders PLMs from further advancing MRC. To bridge the gap between general PLMs and MRC, we present REPT, a REtrieval-based Pre-Training approach. In particular, we introduce two self-supervised tasks to strengthen evidence extraction during pre-training; this capability is then inherited by downstream MRC tasks through the consistent retrieval operation and model architecture. To evaluate the proposed method, we conduct extensive experiments on five MRC datasets that require collecting evidence from, and reasoning across, multiple sentences. Experimental results demonstrate the effectiveness of our pre-training approach. Moreover, further analysis shows that our approach is able to enhance the capacity for evidence extraction without explicit supervision.
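
To make the retrieval operation the abstract alludes to concrete, here is a minimal sketch of sentence-level evidence retrieval: each passage sentence is scored against a question representation, and the highest-scoring sentences are selected as evidence. The function name, tensor shapes, scaled dot-product scoring, and top-k selection are all illustrative assumptions, not the paper's actual REPT implementation or its two self-supervised pre-training tasks.

```python
# Hypothetical sketch of sentence-level evidence retrieval, assuming a PLM
# has already produced one vector per sentence and one for the question.
# Names and shapes are illustrative, not the authors' REPT code.
import torch
import torch.nn.functional as F

def retrieve_evidence(question_vec: torch.Tensor,
                      sentence_vecs: torch.Tensor,
                      top_k: int = 2):
    """question_vec: (hidden,); sentence_vecs: (num_sentences, hidden)."""
    # Scaled dot-product relevance score for each passage sentence.
    scores = sentence_vecs @ question_vec / question_vec.shape[0] ** 0.5
    weights = F.softmax(scores, dim=0)  # soft retrieval distribution
    top = torch.topk(weights, k=min(top_k, weights.numel()))
    # Weighted sum of sentence vectors as a soft evidence summary,
    # plus the indices of the discretely selected evidence sentences.
    evidence = (weights.unsqueeze(1) * sentence_vecs).sum(dim=0)
    return evidence, top.indices

# Toy usage with random encodings standing in for PLM outputs.
hidden = 16
question = torch.randn(hidden)
sentences = torch.randn(5, hidden)
summary, idx = retrieve_evidence(question, sentences)
print(idx)  # indices of the sentences retrieved as evidence
```

Because the selection is a softmax over sentence scores, the same operation can serve both as a differentiable soft retrieval during pre-training and as a discrete evidence extractor at inference time.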

Downloads

Paper

Next from ACL-IJCNLP 2021

DeepRapper: Neural Rap Generation with Rhyme and Rhythm Modeling
technical paper

ACL-IJCNLP 2021

Lanqing Xue

03 August 2021

Similar lecture

Revisiting the Negative Data of Distantly Supervised Relation Extraction
technical paper

ACL-IJCNLP 2021

Chenhao Xie and 5 other authors

02 August 2021