Conference Paper, Year: 2021

Do Multilingual Neural Machine Translation Models Contain Language Pair Specific Attention Heads?

Abstract

Recent studies on the analysis of multilingual representations focus on identifying whether there is an emergence of language-independent representations, or whether a multilingual model partitions its weights among different languages. While most of such work has been conducted in a "black-box" manner, this paper aims to analyze individual components of a multilingual neural machine translation (NMT) model. In particular, we look at the encoder self-attention and encoder-decoder attention heads (in a many-to-one NMT model) that are more specific to the translation of a certain language pair than others, by (1) employing metrics that quantify some aspects of the attention weights, such as "variance" or "confidence", and (2) systematically ranking the importance of attention heads with respect to translation quality. Experimental results show that, surprisingly, the set of most important attention heads is very similar across the language pairs, and that it is possible to remove nearly one-third of the less important heads without greatly hurting translation quality.
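The abstract mentions quantifying attention heads with metrics such as "variance" and "confidence". The sketch below illustrates how such head-level statistics could be computed from a tensor of attention weights; the function names, tensor shapes, and exact definitions here (confidence as the mean of a head's maximum attention weight per target token, variance as the mean per-row variance over source tokens) are illustrative assumptions, not necessarily the paper's exact formulation.

# Hypothetical sketch: head-level "confidence" and "variance" statistics.
# Assumes attention weights of shape (num_heads, tgt_len, src_len),
# with each row (target position) summing to 1 over source positions.
import numpy as np

def head_confidence(attn: np.ndarray) -> np.ndarray:
    # Mean of each head's maximum attention weight per target token.
    # High values suggest heads that focus sharply on a single source token.
    return attn.max(axis=-1).mean(axis=-1)   # shape: (num_heads,)

def head_variance(attn: np.ndarray) -> np.ndarray:
    # Mean variance of each head's attention distribution over source tokens.
    # Low values suggest diffuse, near-uniform attention.
    return attn.var(axis=-1).mean(axis=-1)   # shape: (num_heads,)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(8, 10, 12))    # 8 heads, 10 target tokens, 12 source tokens
    attn = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)  # row-wise softmax
    print("confidence per head:", head_confidence(attn))
    print("variance per head:  ", head_variance(attn))

Such per-head scores could then be compared across language pairs, or used alongside a quality-based ranking, to decide which heads matter most for a given translation direction.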
Main file: Do_Multilingual_Neural_Machine_Translation_Models_Contain_Language_Pair_Specific_Cross_Attention_Heads_-2.pdf (4.53 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03299010, version 1 (25-07-2021)

Identifiers

  • HAL Id: hal-03299010, version 1

Cite

Zae Myung Kim, Laurent Besacier, Vassilina Nikoulina, Didier Schwab. Do Multilingual Neural Machine Translation Models Contain Language Pair Specific Attention Heads?. Findings of ACL 2021, Aug 2021, Bangkok (virtual), Thailand. ⟨hal-03299010⟩
