Keywords:
- autoencoders
- benchmarking
- conditional text generation
- efficient deep learning
- efficient training scheme
- hyperparameter optimization
- hyperparameter tuning
- latent space learning
- natural language processing
- natural language understanding
- optimization
- representation learning
- sentence embedding
- style transfer
- text representation learning
- transformers
- variable-size
- word2vec
Publications of Florian Mai, sorted by recency
Text Representation Learning for Low Cost Natural Language Understanding, Florian Mai, École polytechnique fédérale de Lausanne, 2023
HyperMixer: An MLP-based Low Cost Alternative to Transformers, Florian Mai et al., in: Proc. of the 61st Annual Meeting of the Association for Computational Linguistics, Association for Computational Linguistics, Toronto, Canada, pages 15632–15654, 2023
HyperConformer: Multi-head HyperMixer for Efficient Speech Recognition, Florian Mai et al., in: Proc. Interspeech 2023, Ireland, 2023
Bag-of-Vectors Autoencoders for Unsupervised Conditional Text Generation, Florian Mai et al., in: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), Association for Computational Linguistics, Online, pages 468–488, 2022
HyperMixer: An MLP-based Green AI Alternative to Transformers, Florian Mai et al., arXiv preprint, 2022
Bag-of-Vectors Autoencoders for Unsupervised Conditional Text Generation, Florian Mai et al., Idiap-RR-21-2021
Plug and Play Autoencoders for Conditional Text Generation, Florian Mai et al., in: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, Online, 2020
Plug and Play Autoencoders for Conditional Text Generation, Florian Mai et al., Idiap-RR-24-2020
Optimizer Benchmarking Needs to Account for Hyperparameter Tuning, Florian Mai et al., in: Proceedings of the 37th International Conference on Machine Learning, Vienna, Austria, 2020
Learning Entailment-Based Sentence Embeddings from Natural Language Inference, Florian Mai et al., Idiap-RR-20-2019
On the Tunability of Optimizers in Deep Learning, Florian Mai et al., Idiap-RR-19-2019
CBOW Is Not All You Need: Combining CBOW with the Compositional Matrix Space Model, Florian Mai et al., in: International Conference on Learning Representations, New Orleans, Louisiana, USA, 2019
CBOW Is Not All You Need: Combining CBOW with the Compositional Matrix Space Model, Florian Mai et al., Idiap-RR-06-2019