DOI: https://doi.org/10.48448/nsy4-yg49

workshop paper

ACL-IJCNLP 2021

August 02, 2021

Thailand

One Teacher is Enough? Pre-trained Language Model Distillation from Multiple Teachers



