Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement
Type of publication: Journal paper
Citation: Mohammadshahi_TACL_2020
Journal: Transactions of the Association for Computational Linguistics (under submission)
Year: 2020
Abstract: We propose the Recursive Non-autoregressive Graph-to-Graph Transformer architecture (RNG-Tr) for the iterative refinement of arbitrary graphs through the recursive application of a non-autoregressive Graph-to-Graph Transformer, and apply it to syntactic dependency parsing. The Graph-to-Graph Transformer architecture of \newcite{mohammadshahi2019graphtograph} has previously been used for autoregressive graph prediction, but here we use it to predict all edges of the graph independently, conditioned on a previous prediction of the same graph. We demonstrate the power and effectiveness of RNG-Tr on several dependency corpora, using a refinement model pre-trained with BERT~\cite{devlin2018bert}. We also introduce Dependency BERT (DepBERT), a non-recursive parser similar to our refinement model. RNG-Tr improves the accuracy of a variety of initial parsers on 13 languages from the Universal Dependencies Treebanks and the English and Chinese Penn Treebanks, even improving over the new state-of-the-art results achieved by DepBERT, thereby significantly advancing the state of the art on all corpora tested.
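For intuition, the refinement loop described in the abstract can be sketched in a few lines of Python. This is a minimal illustration only: the helper names (score_edges, decode_graph), the toy scoring, and the fixed-point stopping criterion are assumptions standing in for the actual Graph-to-Graph Transformer and the paper's graph-based decoding, not the authors' implementation.

```python
# Minimal sketch of recursive non-autoregressive refinement (RNG-Tr style).
# All helper names and the toy scorer below are hypothetical stand-ins.

from typing import List

def score_edges(tokens: List[str], heads: List[int]) -> List[List[float]]:
    """Stand-in for the Graph-to-Graph Transformer: scores every candidate
    head (column, 0 = root) for every token (row), conditioned on the
    previously predicted graph via the `heads` input."""
    n = len(tokens)
    # Toy scoring: prefer the previous head, slightly favour the root.
    return [[1.0 if h == heads[i] else (0.5 if h == 0 else 0.0)
             for h in range(n + 1)] for i in range(n)]

def decode_graph(scores: List[List[float]]) -> List[int]:
    """Non-autoregressive decoding: each token picks its best head
    independently (the paper uses graph-based decoding to guarantee a tree)."""
    return [max(range(len(row)), key=row.__getitem__) for row in scores]

def refine(tokens: List[str], init_heads: List[int], max_iters: int = 3) -> List[int]:
    """Recursively re-apply the parser, conditioning each pass on the
    previous prediction, until the graph stops changing (a fixed point)."""
    heads = init_heads
    for _ in range(max_iters):
        new_heads = decode_graph(score_edges(tokens, heads))
        if new_heads == heads:  # converged: another pass would be a no-op
            break
        heads = new_heads
    return heads

# Usage: refine an initial parse of a 3-token sentence (heads index tokens
# from 1, with 0 denoting the root).
print(refine(["She", "eats", "fish"], init_heads=[2, 0, 2]))
```

The key property this sketch tries to convey is that each pass predicts all edges in parallel (non-autoregressively), while quality comes from conditioning each pass on the previous graph and iterating.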
Keywords: Natural language processing, NLP, Parsing, Transformer
Projects: Idiap
Authors: Alireza Mohammadshahi, James Henderson
Crossref by: Mohammadshahi_TACL-2_2020, Mohammadshahi_TACL_2021