Idiap Research Institute
Fine-Tuning Pretrained Models with NVIB for Improved Generalisation
Type of publication: Conference paper
Citation: Fehr_ICLR_2025
Publication status: Accepted
Booktitle: Workshop on Spurious Correlation and Shortcut Learning: Foundations and Solutions
Year: 2025
URL: https://openreview.net/forum?i...
Abstract: Fine-tuned pretrained attention-based models often struggle with generalisation, performing poorly under out-of-domain transfer, distribution shift, and few-shot learning. This limitation is prevalent across modalities such as speech, text, graphs, and vision. Nonparametric Variational Information Bottleneck (NVIB) is an attention-based information-theoretic regulariser applicable to pretrained models that has been shown to improve generalisation. However, prior work has applied NVIB only to the text modality and without fine-tuning. We investigate whether NVIB’s ability to remove information from pretrained embeddings helps the model avoid spurious correlations with noisy and superficial features during fine-tuning. We are the first to integrate NVIB regularisation into fine-tuning across multiple diverse models and modalities. This required architectural modifications that enhance adaptability and stability during fine-tuning and simplify evaluation. We find improved out-of-distribution generalisation in speech quality assessment and language identification, text with induced attention sparsity, graph-based link prediction, and few-shot image classification.
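The following is a minimal illustrative sketch, not the paper's implementation, of what adding an information-bottleneck-style regularisation term to a fine-tuning objective can look like in PyTorch. The encoder, the kl_regulariser placeholder, and the weight beta are assumed stand-ins for the NVIB-specific components rather than details taken from the paper.

import torch
import torch.nn as nn

# Hypothetical stand-ins for a pretrained attention-based encoder and a task head.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(64, 2)
optimizer = torch.optim.AdamW(
    list(encoder.parameters()) + list(head.parameters()), lr=1e-4
)
criterion = nn.CrossEntropyLoss()
beta = 0.1  # assumed weight trading off the task loss against the regulariser

def kl_regulariser(hidden):
    # Placeholder penalty on the information kept in the embeddings.
    # A real NVIB layer computes KL terms over its attention-based latent
    # distributions; an L2 penalty on the embeddings is used here only as a stand-in.
    return hidden.pow(2).mean()

x = torch.randn(8, 16, 64)     # dummy batch: 8 sequences of length 16, dimension 64
y = torch.randint(0, 2, (8,))  # dummy binary labels

optimizer.zero_grad()
hidden = encoder(x)                   # contextual embeddings
logits = head(hidden.mean(dim=1))     # mean-pool then classify
loss = criterion(logits, y) + beta * kl_regulariser(hidden)
loss.backward()
optimizer.step()

The design choice the abstract points to is that the regulariser removes information from the pretrained embeddings during fine-tuning, so the regularisation weight controls how aggressively noisy and superficial features are discarded relative to fitting the task loss.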
Keywords:
Projects: Idiap
EVOLANG
Authors: Fehr, Fabio
Baia, Alina Elena
Chang, Xiaoguang
Coman, Andrei Catalin
El Hajal, Karl
El Zein, Dina
Kumar, Shashi
Zuluaga-Gomez, Juan
Cavallaro, Andrea
Teney, Damien
Henderson, James