VIDEO DOI: https://doi.org/10.48448/t4n6-ph24

Technical paper

COLING 2022 • October 12, 2022 • Gyeongju, Republic of Korea

On the Role of Pre-trained Language Models in Word Ordering: A Case Study with BART

Downloads

Paper
Transcript: English (automatic)
