VIDEO DOI: https://doi.org/10.48448/g7wn-7125

findings / work in progress

ACL 2022

May 25, 2022

Dublin, Ireland

Do Pre-trained Models Benefit Knowledge Graph Completion? A Reliable Evaluation and a Reasonable Approach


Downloads: Slides · Paper · Transcript (English, automatic)

Next from ACL 2022

Document-Level Relation Extraction with Adaptive Focal Loss and Knowledge Distillation
findings / work in progress

ACL 2022

Qingyu Tan, Ruidan He, and 2 other authors

25 May 2022

Similar lecture

A Comparative Study of Pre-trained Encoders for Low-Resource Named Entity Recognition
workshop paper

ACL 2022

Yuxuan Chen, Arne Binder, and 3 other authors

26 May 2022