Better Semi-supervised Learning for Multi-domain ASR Through Incremental Retraining and Data Filtering
Type of publication: Conference paper
Citation: Carofilis_INTERSPEECH2025_2025
Publication status: Accepted
Booktitle: Interspeech 2025
Year: 2025
Month: August
Pages: 3618--3622
Location: Rotterdam, The Netherlands
ISSN: 2958-1796
URL: https://www.isca-archive.org/i...
DOI: 10.21437/Interspeech.2025-2601
Abstract: Fine-tuning pretrained ASR models for specific domains is challenging when labeled data is scarce, but unlabeled audio and labeled data from related domains are often available. We propose an incremental semi-supervised learning pipeline that first integrates a small in-domain labeled set and an auxiliary dataset from a closely related domain, achieving a relative improvement of 4% over using no auxiliary data. Filtering based on multi-model consensus or named entity recognition (NER) is then applied to select and iteratively refine pseudo-labels, showing slower performance saturation compared to random selection. Evaluated on the multi-domain Wow call center and Fisher English corpora, the pipeline outperforms single-step fine-tuning. Consensus-based filtering outperforms other methods, providing up to 22.3% relative improvement on Wow and 24.8% on Fisher over single-step fine-tuning with random selection. NER is the second-best filter, providing competitive performance at a lower computational cost.
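The abstract's central selection step is multi-model consensus filtering of pseudo-labels. As a minimal, hypothetical sketch (the paper's exact agreement metric, number of models, and threshold are not specified here), one common realization keeps an unlabeled utterance only when several ASR models produce near-identical transcripts, e.g. a low maximum pairwise word error rate (WER):

```python
# Hypothetical sketch of multi-model consensus filtering for pseudo-labels.
# The agreement metric (pairwise WER) and threshold are illustrative
# assumptions, not the paper's exact configuration.

from itertools import combinations

def wer(ref: list[str], hyp: list[str]) -> float:
    """Word error rate via Levenshtein distance over word sequences."""
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[-1][-1] / max(len(ref), 1)

def consensus_filter(hypotheses: dict[str, list[str]], threshold: float = 0.1):
    """Select utterances whose transcripts from all models agree closely.

    hypotheses maps utterance_id -> list of transcripts (one per model).
    Returns (utterance_id, pseudo_label) pairs where the maximum pairwise
    WER among the model outputs is below `threshold`.
    """
    selected = []
    for utt_id, texts in hypotheses.items():
        words = [t.split() for t in texts]
        max_wer = max(wer(a, b) for a, b in combinations(words, 2))
        if max_wer < threshold:
            # Take the first model's hypothesis as the pseudo-label.
            selected.append((utt_id, texts[0]))
    return selected

# Example: three models transcribe two utterances; only the first reaches
# consensus and survives the filter.
hyps = {
    "utt1": ["call the support line", "call the support line",
             "call a support line"],
    "utt2": ["cancel my order", "counsel my odor", "cancel my order"],
}
print(consensus_filter(hyps, threshold=0.35))  # [('utt1', 'call the support line')]
```

In the incremental setting the abstract describes, the surviving pseudo-labels would be added to the training set and the models retrained, with the filter reapplied each round; this iterative refinement is what the paper credits with slower performance saturation than random selection.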
Main Research Program: Human-AI Teaming
Projects: Idiap
UNIPHORE
Authors: Carofilis, Andrés
Rangappa, Pradeep
Madikeri, Srikanth
Kumar, Shashi
Burdisso, Sergio
Prakash, Jeena
Villatoro-Tello, Esaú
Motlicek, Petr
Sharma, Bidisha
Hacioğlu, Kadri
Venkatesan, Shankar
Vyas, Saurabh
Stolcke, Andreas
Attachments
  • Carofilis_INTERSPEECH2025_2025.pdf