Keywords:
- Analysis of Neural Networks
- attention-based models
- autoencoders
- deep learning
- distributional semantics
- entailment
- evaluation
- generation
- hallucination detection
- Hierarchical Attention Networks
- human annotations
- language modeling
- latent space learning
- LLM
- Machine Translation
- Multilingual Learning
- multilinguality
- natural language
- natural language generation
- Natural language processing
- natural language semantics
- neural machine translation
- neural networks
- NLP
- Nonparametric Variational Information Bottleneck
- Nonparametric VIB
- Out-of-domain generalisation
- output representation learning
- Parsing
- Post-training regularisation
- QA
- Reinterpretation
- representation learning
- style transfer
- Summarization
- Text classification
- Transformer
- transformers
- unsupervised learning
- VAE
- variable-size
- VIB
- word embedding
- word sense disambiguation
- Zero-shot Learning
Publications of James Henderson sorted by first author
A
Multilingual Extraction and Categorization of Lexical Collocations with Graph-aware Transformers, in: Proceedings of the 11th Joint Conference on Lexical and Computational Semantics, Seattle, USA, pages 89–100, 2022
B
The DCU-EPFL Enhanced Dependency Parser at the IWPT 2021 Shared Task, in: Proceedings of the 17th International Conference on Parsing Technologies and the IWPT 2021 Shared Task on Parsing into Enhanced Universal Dependencies, Online, pages 204-212, Association for Computational Linguistics, 2021
Learning to Abstract with Nonparametric Variational Information Bottleneck, in: The 2023 Conference on Empirical Methods in Natural Language Processing, 2023
Inducing Meaningful Units from Character Sequences with Dynamic Capacity Slot Attention, in: Transactions on Machine Learning Research, 2023
Idiap Scientific Report 2022, Idiap-RR-05-2023
C
GADePo: Graph-Assisted Declarative Pooling Transformers for Document-Level Relation Extraction, in: Proceedings of the 3rd Workshop on Knowledge Augmented Methods for NLP, Association for Computational Linguistics, 2024
F
Nonparametric Variational Regularisation of Pretrained Transformers, in: First Conference on Language Modeling, 2024
Nonparametric Variational Regularisation of Pretrained Transformers, in: arXiv, 2023
H
The Unstoppable Rise of Computational Linguistics in Deep Learning, in: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online, pages 6294-6306, Association for Computational Linguistics, 2020
A VAE for Transformers with Nonparametric Variational Information Bottleneck, in: The Eleventh International Conference on Learning Representations, 2023
A Variational AutoEncoder for Transformers with Nonparametric Variational Information Bottleneck, in: arXiv, 2022
Transformers as Graph-to-Graph Models, in: Big Picture Workshop at EMNLP 2023, 2023
K
Variational Information Bottleneck for Effective Low-Resource Fine-Tuning, in: ICLR, 2021
End-to-End Bias Mitigation by Modelling Biases in Corpora, in: ACL, 2020
Compacter: Efficient Low-Rank Hypercomplex Adapter Layers, in: NeurIPS, 2021
Learning Entailment-Based Sentence Embeddings from Natural Language Inference, Idiap-RR-20-2019
Parameter-efficient Multi-task Fine-tuning for Transformers via Shared Hypernetworks, in: ACL, 2021
PERFECT: Prompt-free and Efficient Few-shot Learning with Language Models, in: ACL, 2022
M
Bag-of-Vectors Autoencoders for Unsupervised Conditional Text Generation, in: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), Association for Computational Linguistics, Online, pages 468–488, 2022
Bag-of-Vectors Autoencoders for Unsupervised Conditional Text Generation, Idiap-RR-21-2021
HyperMixer: An MLP-based Low Cost Alternative to Transformers, in: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics, Toronto, Canada, pages 15632-15654, Association for Computational Linguistics, 2023
HyperMixer: An MLP-based Green AI Alternative to Transformers, in: arXiv, 2022
Plug and Play Autoencoders for Conditional Text Generation, Idiap-RR-24-2020
Plug and Play Autoencoders for Conditional Text Generation, in: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, Online, 2020
Unsupervised Token-level Hallucination Detection from Summary Generation By-products, in: Workshop on Generation, Evaluation and Metrics (GEM), 2022
Sentence-level Planning for Especially Abstractive Summarization, in: Proceedings of the Third Workshop on New Frontiers in Summarization, pages 1-14, Association for Computational Linguistics, 2021
A Corpus and Evaluation for Predicting Semi-Structured Human Annotations, in: Workshop on Generation, Evaluation and Metrics (GEM), 2022
Graph Refinement for Coreference Resolution, in: Findings of the Association for Computational Linguistics: ACL 2022, 2022
Partially-supervised Mention Detection, in: Proceedings of the Third Workshop on Computational Models of Reference, Anaphora and Coreference, 2020
Document-Level Neural Machine Translation with Hierarchical Attention Networks, in: Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2018
Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling, in: Proceedings of the 8th Workshop on Representation Learning for NLP, 2023
Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling, in: arXiv, 2021
Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement, in: Transactions of the Association for Computational Linguistics, 9:18, 2021
Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement, in: Transactions of the Association for Computational Linguistics (under submission), 2020
Graph-to-Graph Transformer for Transition-based Dependency Parsing, in: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, Association for Computational Linguistics, 2020
Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement, in: Transactions of the Association for Computational Linguistics, 2020
Graph-to-Graph Transformer for Transition-based Dependency Parsing, in: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings, Online, pages 3278–3289, Association for Computational Linguistics, 2020