DOI: https://doi.org/10.48448/d768-0436

Technical paper

ACL-IJCNLP 2021

August 02, 2021

Thailand

Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer


