Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training

10/10/2020
by Xinyu Wang, et al.

In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of very recent state-of-the-art second-order graph-based neural dependency parsers while being significantly faster in both training and testing. We also demonstrate the advantage of second-order parsing over first-order parsing, and observe that the usefulness of the head-selection structured constraint vanishes when using BERT embeddings.
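To make the message-passing idea concrete, below is a minimal sketch of one common way second-order inference is approximated in such parsers: mean-field variational inference over head-selection variables, combining first-order arc scores with second-order sibling scores. The tensor names (`s_arc`, `s_sib`), the PyTorch implementation, and the use of only sibling factors are assumptions for illustration, not the authors' exact formulation.

```python
# Hedged sketch: mean-field message passing for second-order dependency parsing.
# s_arc[b, h, m]   : first-order (head -> modifier) scores
# s_sib[b, h, m, s]: second-order sibling scores (h heads both m and s)
# These shapes and names are hypothetical, chosen for illustration.
import torch

def mfvi_second_order(s_arc: torch.Tensor,
                      s_sib: torch.Tensor,
                      num_iters: int = 3) -> torch.Tensor:
    """Return approximate head marginals q[b, h, m] after message passing."""
    # Initialize beliefs from first-order scores: softmax over candidate heads h.
    q = torch.softmax(s_arc, dim=1)
    for _ in range(num_iters):
        # Second-order message to each (h, m): expected sibling score,
        # summing over siblings s weighted by their current head beliefs q[b, h, s].
        msg = torch.einsum('bhs,bhms->bhm', q, s_sib)
        # Combine unary and pairwise terms, renormalize over heads.
        q = torch.softmax(s_arc + msg, dim=1)
    return q

# Example usage on random scores (batch of 2, sentence length 6 incl. root):
B, N = 2, 6
marginals = mfvi_second_order(torch.randn(B, N, N), torch.randn(B, N, N, N))
```

Because every update is differentiable, the iterations can be unrolled and trained end-to-end with the neural scorers, which is what makes this style of inference compatible with the end-to-end training described in the abstract.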
