VIDEO DOI: https://doi.org/10.48448/03kx-cv86

poster

COLING 2022

October 12, 2022

Gyeongju, Republic of Korea

Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression


Downloads

Paper, Transcript English (automatic)

Next from COLING 2022

Emotion Enriched Retrofitted Word Embeddings
technical paper

COLING 2022

Sreedhar Reddy, Pushpak Bhattacharyya, Sapan Shah

12 October 2022

Similar lecture

Knowledge Distillation for Model-Agnostic Meta-Learning
technical paper

ECAI 2020

Zhang Min

31 August 2020