Idiap Research Institute
BertOdia: BERT pre-training for low resource Odia language
Type of publication: Idiap-RR
Citation: Parida_Idiap-RR-16-2021
Number: Idiap-RR-16-2021
Year: 2021
Month: 10
Institution: Idiap
Note: Accepted at 2nd International Conference on Biologically Inspired Techniques in Many-Criteria Decision Making (BITMDM-2021)
Abstract: The Odia language is one of the 30 most spoken languages in the world. It is spoken in the Indian state of Odisha. The Odia language lacks online content and resources for natural language processing (NLP) research. There is a great need for a better language model for the low-resource Odia language, which can be used for many downstream NLP tasks. In this paper, we introduce a BERT-based language model, pre-trained on 430,000 Odia sentences. We also evaluate the model on the well-known Kaggle Odia news classification dataset (BertOdia: 96%, RoBERTaOdia: 92%, and ULMFit: 91.9% classification accuracy), and perform a comparison study with multilingual Bidirectional Encoder Representations from Transformers (BERT) models supporting Odia. The model will be released publicly for researchers to explore other NLP tasks.
Projects: Idiap, EC H2020-ROXANNE
Authors: Parida, Shantipriya; Biswal, Satya Prakash; Nayak, Biranchi Narayan; Fabien, Mael; Villatoro-Tello, Esaú; Motlicek, Petr
Attachments
  • Parida_Idiap-RR-16-2021.pdf