VIDEO DOI: https://doi.org/10.48448/f6gn-hy88

technical paper

COLING 2022

October 12, 2022

Gyeongju, Republic of Korea

"No, they did not'': Dialogue response dynamics in pre-trained language models

Downloads: Paper, Transcript (English, automatic)

