CONF
Parida_MMTLRL-2021_2021/IDIAP
Multimodal Neural Machine Translation System for English to Bengali
Parida, Shantipriya
Panda, Subhadarshi
Biswal, Satya Prakash
Kotwal, Ketan
Sen, Arghyadeep
Dash, Satya Ranjan
Motlicek, Petr
https://publications.idiap.ch/index.php/publications/showcite/Parida_Idiap-RR-13-2021
Related documents
Proceedings of the First Workshop on Multimodal Machine Translation for Low Resource Languages (MMTLRL 2021)
Online (Virtual Mode)
2021
INCOMA Ltd.
31--39
https://aclanthology.org/2021.mmtlrl-1.6
URL
Multimodal Machine Translation (MMT) systems utilize additional information from other modalities beyond text to improve the quality of machine translation (MT). The additional modality is typically in the form of images. Despite proven advantages, it is difficult to develop an MMT system for various languages, primarily due to the lack of suitable multimodal datasets. In this work, we develop an MMT system for English-to-Bengali using the recently published Bengali Visual Genome (BVG) dataset, which contains images with associated bilingual textual descriptions. Through a comparative study of the developed MMT system vis-a-vis a text-to-text translation system, we demonstrate that the use of multimodal data not only improves translation performance, with BLEU score gains of +1.3 on the development set, +3.9 on the evaluation test set, and +0.9 on the challenge test set, but also helps to resolve ambiguities in the pure text description. To the best of our knowledge, our English-Bengali MMT system is the first attempt in this direction, and thus can serve as a baseline for subsequent research in MMT for low-resource languages.
REPORT
Parida_Idiap-RR-13-2021/IDIAP
Multimodal Neural Machine Translation System for English to Bengali
Parida, Shantipriya
Panda, Subhadarshi
Biswal, Satya Prakash
Kotwal, Ketan
Sen, Arghyadeep
Dash, Satya Ranjan
Motlicek, Petr
Low resource language
Machine Translation
Multimodal machine translation
EXTERNAL
https://publications.idiap.ch/attachments/reports/2021/Parida_Idiap-RR-13-2021.pdf
PUBLIC
Idiap-RR-13-2021
2021
Idiap
September 2021
Multimodal Machine Translation (MMT) systems utilize additional information from other modalities beyond text to improve the quality of machine translation (MT). The additional modality is typically in the form of images. Despite proven advantages, it is difficult to develop an MMT system for various languages, primarily due to the lack of suitable multimodal datasets. In this work, we develop an MMT system for English-to-Bengali using the recently published Bengali Visual Genome (BVG) dataset, which contains images with associated bilingual textual descriptions. Through a comparative study of the developed MMT system vis-a-vis a text-to-text translation system, we demonstrate that the use of multimodal data not only improves translation performance, with BLEU score gains of +1.3 on the development set, +3.9 on the evaluation test set, and +0.9 on the challenge test set, but also helps to resolve ambiguities in the pure text description. To the best of our knowledge, our English-Bengali MMT system is the first attempt in this direction, and thus can serve as a baseline for subsequent research in MMT for low-resource languages.