Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling
| Type of publication: | Conference paper |
| Citation: | Mohammadshahi_REP4NLPATACL2023_2023 |
| Booktitle: | Proceedings of the 8th Workshop on Representation Learning for NLP |
| Year: | 2023 |
| Month: | July |
| Crossref: | Mohammadshahi_ARXIV_2021 |
| URL: | https://arxiv.org/abs/2104.077... |
| Abstract: | Recent models have shown that incorporating syntactic knowledge into the semantic role labelling (SRL) task leads to significant improvements. In this paper, we propose the Syntax-aware Graph-to-Graph Transformer (SynG2G-Tr) model, which encodes syntactic structure using a novel method of inputting graph relations as embeddings directly into the self-attention mechanism of the Transformer. This approach adds a soft bias towards attention patterns that follow the syntactic structure, but also allows the model to use this information to learn alternative patterns. We evaluate our model on both span-based and dependency-based SRL datasets, and it outperforms previous alternative methods in both in-domain and out-of-domain settings on the CoNLL 2005 and CoNLL 2009 datasets. |
| Keywords: | |
| Authors: | Mohammadshahi, Alireza; Henderson, James |
Notes
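The abstract's core mechanism is to embed the syntactic relation between each token pair and add it inside self-attention as a soft bias. Below is a minimal, self-contained sketch of that idea, not the authors' released code: the class name `SyntaxBiasedSelfAttention`, the single-head formulation, and parameters such as `num_relations` are illustrative assumptions, in the spirit of relation-embedding attention rather than a faithful reproduction of SynG2G-Tr.

```python
# Hypothetical sketch of syntax-biased self-attention (not the paper's code).
import math
import torch
import torch.nn as nn

class SyntaxBiasedSelfAttention(nn.Module):
    """Single-head self-attention whose scores are biased by learned
    graph-relation embeddings, so syntax guides but does not constrain
    the attention distribution."""

    def __init__(self, d_model: int, num_relations: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One learned embedding per dependency-relation label
        # (including an id for "no relation").
        self.rel_emb = nn.Embedding(num_relations, d_model)
        self.scale = math.sqrt(d_model)

    def forward(self, x: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        # rel_ids: (batch, seq, seq) label id of the syntactic relation
        #          between tokens i and j (0 where no edge exists).
        q, k, v = self.q(x), self.k(x), self.v(x)
        content = torch.einsum("bid,bjd->bij", q, k)   # token-token scores
        r = self.rel_emb(rel_ids)                      # (batch, i, j, d)
        syntax = torch.einsum("bid,bijd->bij", q, r)   # query-relation scores
        # Additive bias before softmax: syntax nudges, never hard-masks.
        attn = torch.softmax((content + syntax) / self.scale, dim=-1)
        return attn @ v

# Toy usage: 2 sentences of 5 tokens, 4 relation labels (id 0 = "no relation").
layer = SyntaxBiasedSelfAttention(d_model=64, num_relations=4)
x = torch.randn(2, 5, 64)
rel_ids = torch.randint(0, 4, (2, 5, 5))
out = layer(x, rel_ids)   # (2, 5, 64)
```

Because the relation term enters as an additive score before the softmax rather than as a mask, the model can follow the parse when it helps yet learn alternative attention patterns, which matches the "soft bias" claim in the abstract.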