Idiap Research Institute
Latent Space Factorization in LoRA
Type of publication: Conference paper
Citation: Kumar_NEURIPS2025_2025
Publication status: Accepted
Booktitle: 39th Conference on Neural Information Processing Systems (NeurIPS 2025)
Year: 2025
Month: December
URL: https://arxiv.org/abs/2510.196...
Abstract: Low-rank adaptation (LoRA) is a widely used method for parameter-efficient finetuning. However, existing LoRA variants lack mechanisms to explicitly disambiguate task-relevant information within the learned low-rank subspace, potentially limiting downstream performance. We propose Factorized Variational Autoencoder LoRA (FVAE-LoRA), which leverages a VAE to learn two distinct latent spaces. Our novel Evidence Lower Bound formulation explicitly promotes factorization between the latent spaces, dedicating one latent space to task-salient features and the other to residual information. Extensive experiments on text, audio, and image tasks demonstrate that FVAE-LoRA consistently outperforms standard LoRA. Moreover, spurious correlation evaluations confirm that FVAE-LoRA better isolates task-relevant signals, leading to improved robustness under distribution shifts. Our code is publicly available at: https://github.com/idiap/FVAE-LoRA
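For context, the standard LoRA update that the abstract builds on can be sketched as follows. This is a minimal illustrative sketch of vanilla LoRA, not the authors' FVAE-LoRA method (whose factorized-VAE ELBO is not detailed here); all dimensions and variable names are assumptions chosen for illustration.

```python
# Minimal sketch of a LoRA-adapted linear layer: the frozen weight W is
# augmented with a trainable low-rank update B @ A, so only r*(d_in + d_out)
# parameters are finetuned instead of d_in*d_out.
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 8, 8, 2                   # illustrative sizes; rank r << d

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))                   # trainable up-projection, zero-init

def lora_forward(x):
    """y = W x + B (A x); during finetuning only A and B receive gradients."""
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
# With B initialized to zero, the adapted layer reproduces the frozen one.
assert np.allclose(lora_forward(x), W @ x)
```

FVAE-LoRA replaces this single low-rank subspace with a VAE whose ELBO factorizes the latent representation into a task-salient space and a residual space; see the linked repository for the actual implementation.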
Main Research Program: Human-AI Teaming
Additional Research Programs: AI for Everyone
Keywords: fvae-lora, latent space factorization, LoRA, low-rank adaptation, spurious correlation robustness
Projects: UNIPHORE, ELOQUENCE, ChaSpeePro
Authors: Kumar, Shashi; Kaloga, Yacouba; Mitros, John; Motlicek, Petr; Kodrasi, Ina
Attachments
  • Kumar_NEURIPS2025_2025.pdf