VIDEO DOI: https://doi.org/10.48448/714v-7w58

poster

NAACL 2022

July 13, 2022

Seattle, United States

Por Qué Não Utiliser Alla Språk? Mixed Training with Gradient Optimization in Few-Shot Cross-Lingual Transfer



