VIDEO DOI: https://doi.org/10.48448/5jvh-6y73

technical paper

COLING 2022

October 12, 2022

Gyeongju, Republic of Korea

Improving Continual Relation Extraction through Prototypical Contrastive Learning


Downloads

Paper
Transcript English (automatic)

Next from COLING 2022

Transferring Knowledge from Structure-aware Self-attention Language Model to Sequence-to-Sequence Semantic Parsing
poster

COLING 2022

Ran Ji

12 October 2022

Similar lecture

HiCLRE: A Hierarchical Contrastive Learning Framework for Distantly Supervised Relation Extraction
findings / work in progress

ACL 2022

Dongyang Li and 4 other authors

24 May 2022