Idiap Research Institute
Cross-lingual Transfer for News Article Labeling: Benchmarking Statistical and Neural Models
Type of publication: Idiap-RR
Citation: Mrini_Idiap-RR-26-2017
Number: Idiap-RR-26-2017
Year: 2017
Month: 9
Institution: Idiap
Address: Rue Marconi 19, CH-1920 Martigny
Note: Report of EPFL semester project done by Khalil Mrini (1st year I&C MSc student), supervised by N. Pappas and A. Popescu-Belis.
Abstract: Cross-lingual transfer has been shown to improve the performance of text classification models through Multilingual Hierarchical Attention Networks (MHAN), on which this work is based. First, we compared the performance of monolingual and multilingual HANs with three types of bag-of-words models. We found that the binary unigram model, when using the full vocabulary, outperforms the HAN model with Dense encoders in 6 out of 8 languages and ties with MHAN with Dense encoders; in that setting, however, it uses many more parameters than the neural models. This advantage disappears when we limit the number of parameters and/or increase the sophistication of the neural encoders to GRU or biGRU. Second, we tested new parameter-sharing configurations. We found that sharing attention at the sentence level was the best configuration, by a small margin, when transferring from 5 out of 7 languages to English, as well as for cross-lingual transfer between English and each of Spanish, Russian, and Arabic. The tests were performed on the Deutsche Welle news corpus, covering 8 languages and 600k documents.
Keywords: document labeling, multilingual hierarchical networks
Projects: Idiap, SUMMA
Authors: Mrini, Khalil; Pappas, Nikolaos; Popescu-Belis, Andrei
Attachments
  • Mrini_Idiap-RR-26-2017.pdf