Idiap Research Institute
A VAE for Transformers with Nonparametric Variational Information Bottleneck
Type of publication: Conference paper
Citation: Henderson_ICLR_2023
Publication status: Accepted
Booktitle: The Eleventh International Conference on Learning Representations
Year: 2023
URL: https://openreview.net/forum?i...
Abstract: We propose a Variational AutoEncoder (VAE) for Transformers by developing a Variational Information Bottleneck (VIB) regulariser for Transformer embeddings. We formalise such attention-based representations as mixture distributions, and use Bayesian nonparametrics to develop a Nonparametric VIB (NVIB) for them. The variable number of mixture components supported by nonparametrics captures the variable number of vectors supported by attention, and exchangeable distributions from nonparametrics capture the permutation invariance of attention. Our Transformer VAE (NVAE) uses NVIB to regularise the information passing from the Transformer encoder to the Transformer decoder. Evaluations of an NVAE, trained on natural language text, demonstrate that NVIB can regularise the number of mixture components in the induced embedding whilst maintaining generation quality and reconstruction capacity.
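The abstract describes a VIB-style bottleneck regularising the information passed from the Transformer encoder to the decoder. As a minimal illustrative sketch only, the snippet below shows a plain Gaussian VIB layer in NumPy (reparameterised sampling plus a KL regulariser to a standard normal prior); the function name `vib_layer` and all shapes are hypothetical, and this is the standard Gaussian VIB, not the paper's nonparametric NVIB over mixture distributions.

```python
import numpy as np

def vib_layer(enc_out, rng):
    """Illustrative Gaussian VIB bottleneck (hypothetical sketch, not the
    paper's NVIB): split each encoder vector into a mean and log-variance,
    sample a latent with the reparameterisation trick, and return the
    sample together with a KL regulariser to a standard normal prior."""
    d = enc_out.shape[-1] // 2
    mu, logvar = enc_out[..., :d], enc_out[..., d:]
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * logvar) * eps  # reparameterised sample
    # KL( N(mu, sigma^2) || N(0, I) ), summed over latent dims, averaged over tokens
    kl = 0.5 * np.sum(mu**2 + np.exp(logvar) - logvar - 1.0, axis=-1)
    return z, kl.mean()

rng = np.random.default_rng(0)
enc_out = rng.standard_normal((5, 16))  # 5 tokens, 16 = 2 * latent dim
z, kl = vib_layer(enc_out, rng)
```

In training, the KL term would be added (with a weight) to the reconstruction loss, trading off compression of the encoder representation against decoding quality; NVIB additionally regularises the *number* of mixture components, which this Gaussian sketch does not capture.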
Keywords: natural language, transformers, VAE, VIB
Projects: Idiap, EVOLANG
Authors: Henderson, James; Fehr, Fabio
Attachments
  • Henderson_ICLR_2023.pdf