DOI: https://doi.org/10.48448/sqvk-y475

technical paper

ACL-IJCNLP 2021

August 03, 2021

Thailand

What Context Features Can Transformer Language Models Use?


Downloads: Slides · Paper · Transcript (English, automatic)
