A Novel Neural Network Model for Joint POS Tagging and Graph-based Dependency Parsing

05/16/2017
by Dat Quoc Nguyen, et al.

We present a novel neural network model that learns POS tagging and graph-based dependency parsing jointly. Our model uses bidirectional LSTMs to learn feature representations shared between the POS tagging and dependency parsing tasks, thus avoiding the need for hand-crafted feature engineering. Extensive experiments on 19 languages from the Universal Dependencies project show that our model outperforms the state-of-the-art neural network-based Stack-propagation model for joint POS tagging and transition-based dependency parsing, establishing a new state of the art. Our code is open source and available together with pre-trained models at: https://github.com/datquocnguyen/jPTDP
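As a rough illustration of the architecture the abstract describes (a shared bidirectional-LSTM encoder feeding both a POS tagging head and a graph-based arc scorer), here is a minimal, hypothetical PyTorch sketch. The class and parameter names (JointTaggerParser, head_mlp, arc_weight, etc.) and the biaffine-style arc scorer are assumptions made for this sketch, not the actual jPTDP implementation.

```python
# Hypothetical sketch: shared BiLSTM encoder with a POS tagging head and a
# biaffine-style arc-scoring head for graph-based dependency parsing.
import torch
import torch.nn as nn

class JointTaggerParser(nn.Module):
    def __init__(self, vocab_size, n_pos_tags, emb_dim=100, lstm_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared bidirectional LSTM: its states feed both tasks.
        self.bilstm = nn.LSTM(emb_dim, lstm_dim, batch_first=True,
                              bidirectional=True)
        # POS tagging head: a linear classifier over each token's state.
        self.pos_out = nn.Linear(2 * lstm_dim, n_pos_tags)
        # Parsing head: project each state into "head" and "dependent"
        # representations, then score every candidate arc with a biaffine form.
        self.head_mlp = nn.Linear(2 * lstm_dim, lstm_dim)
        self.dep_mlp = nn.Linear(2 * lstm_dim, lstm_dim)
        self.arc_weight = nn.Parameter(torch.randn(lstm_dim, lstm_dim) * 0.01)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices.
        states, _ = self.bilstm(self.embed(token_ids))     # (B, T, 2*lstm_dim)
        pos_scores = self.pos_out(states)                  # (B, T, n_pos_tags)
        heads = torch.relu(self.head_mlp(states))          # (B, T, d)
        deps = torch.relu(self.dep_mlp(states))            # (B, T, d)
        # arc_scores[b, j, i] = score of token i being the head of token j.
        arc_scores = torch.einsum('bid,de,bje->bji',
                                  heads, self.arc_weight, deps)
        return pos_scores, arc_scores

# Toy usage: both heads are trained jointly by summing their losses, so
# gradients from tagging and parsing both update the shared BiLSTM.
model = JointTaggerParser(vocab_size=1000, n_pos_tags=17)
pos_scores, arc_scores = model(torch.randint(0, 1000, (2, 8)))
print(pos_scores.shape, arc_scores.shape)   # (2, 8, 17) and (2, 8, 8)
```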

Related research

07/11/2018
An improved neural network model for joint POS tagging and dependency parsing
We propose a novel neural network model for joint part-of-speech (POS) t...

04/22/2017
Deep Multitask Learning for Semantic Dependency Parsing
We present a deep neural architecture that parses sentences into three s...

03/15/2017
SyntaxNet Models for the CoNLL 2017 Shared Task
We describe a baseline dependency parsing system for the CoNLL2017 Share...

03/06/2020
Is POS Tagging Necessary or Even Helpful for Neural Dependency Parsing?
In the pre-deep-learning era, part-of-speech tags have been considered a...

08/12/2021
Combining (second-order) graph-based and headed span-based projective dependency parsing
Graph-based methods have been popular in dependency parsing for decades. Recen...

09/11/2021
COMBO: State-of-the-Art Morphosyntactic Analysis
We introduce COMBO - a fully neural NLP system for accurate part-of-spee...

12/15/2020
Jet tagging in the Lund plane with graph networks
The identification of boosted heavy particles such as top quarks or vect...
