Idiap Research Institute
Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement
Type of publication: Journal paper
Citation: Mohammadshahi_TACL_2021
Publication status: Accepted
Journal: Transactions of the Association for Computational Linguistics (2021)
Volume: 9
Year: 2021
Month: March
Pages: 18
Crossref: Mohammadshahi_TACL_2020: Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement, Mohammadshahi, Alireza and Henderson, James, in: Transactions of the Association for Computational Linguistics (under submission), 2020
URL: https://direct.mit.edu/tacl/ar...
DOI: https://doi.org/10.1162/tacl_a_00358
Abstract: We propose the Recursive Non-autoregressive Graph-to-Graph Transformer architecture (RNGTr) for the iterative refinement of arbitrary graphs through the recursive application of a non-autoregressive Graph-to-Graph Transformer, and apply it to syntactic dependency parsing. We demonstrate the power and effectiveness of RNGTr on several dependency corpora, using a refinement model pre-trained with BERT. We also introduce Syntactic Transformer (SynTr), a non-recursive parser similar to our refinement model. RNGTr improves the accuracy of a variety of initial parsers on 13 languages from the Universal Dependencies Treebanks, the English and Chinese Penn Treebanks, and the German CoNLL 2009 corpus, even improving over the new state-of-the-art results achieved by SynTr, and significantly advancing the state of the art for all corpora tested.
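The refinement procedure described in the abstract can be sketched as a simple loop: an initial parser proposes a full dependency graph, and a non-autoregressive refiner repeatedly re-predicts every arc at once, conditioned on the previous graph, until the graph stops changing or an iteration budget runs out. The sketch below is illustrative only; the function names (`rng_refine`, `refine_step`) and the toy refiner are hypothetical, not the authors' code, which conditions a BERT-initialised Transformer on the previous parse.

```python
# Illustrative sketch of a recursive non-autoregressive refinement loop
# (hypothetical names; the real RNGTr refiner is a Graph-to-Graph Transformer).

def rng_refine(tokens, initial_heads, refine_step, max_iters=3):
    """Iteratively refine a dependency graph.

    tokens: the words of the sentence.
    initial_heads: head index per token (0 = root) from some initial parser.
    refine_step: function (tokens, heads) -> new heads, predicting all
        arcs at once (non-autoregressively) for the whole sentence.
    Stops early when the graph reaches a fixed point.
    """
    heads = list(initial_heads)
    for _ in range(max_iters):
        new_heads = refine_step(tokens, heads)
        if new_heads == heads:  # fixed point: no arc changed
            break
        heads = new_heads
    return heads

# Toy refiner: reattach any token that heads itself (a cycle) to the root.
def toy_refiner(tokens, heads):
    return [0 if h == i + 1 else h for i, h in enumerate(heads)]

print(rng_refine(["the", "cat", "sat"], [2, 3, 3], toy_refiner))  # [2, 3, 0]
```

The key property the sketch illustrates is that each pass predicts the whole graph in parallel, so refinement cost grows with the number of passes, not with sentence length as in autoregressive decoding.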
Projects: Idiap, Intrepid
Authors: Mohammadshahi, Alireza; Henderson, James
Attachments
  • Mohammadshahi_TACL_2021.pdf (Tacl_paper)