Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP

09/16/2020
by   Hao Fei, et al.

Syntax has been shown to be useful for various NLP tasks, yet existing work mostly encodes a single syntactic tree with one hierarchical neural network. In this paper, we investigate a simple and effective method, Knowledge Distillation, for integrating heterogeneous structural knowledge into a unified sequential LSTM encoder. Experimental results on four typical syntax-dependent tasks show that our method outperforms tree encoders by effectively integrating rich, heterogeneous structural syntax while reducing error propagation, and that it also outperforms ensemble methods in both efficiency and accuracy.
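The abstract does not spell out the distillation objective, but the standard knowledge-distillation recipe it builds on blends soft targets from a teacher (here, a tree-structured encoder) with hard gold labels when training the student (the sequential LSTM). A minimal NumPy sketch of that generic loss, with illustrative function names and default temperature/weight values that are assumptions rather than the paper's settings:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic distillation loss (illustrative, not the paper's exact objective).

    alpha weights the soft (teacher-mimicking) term against the
    hard cross-entropy term on gold labels.
    """
    # Soft term: KL(teacher || student) at temperature T, scaled by T^2.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1).mean() * T * T
    # Hard term: cross-entropy of the student against gold labels.
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels]).mean()
    return alpha * soft + (1 - alpha) * hard
```

With several heterogeneous tree encoders as teachers, one could average or otherwise combine their soft terms, which is one plausible reading of how "rich heterogeneous structure syntax" is integrated into a single student.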


Related research:

- 04/11/2021 · Disentangling Semantics and Syntax in Sentence Embeddings with Pre-trained Language Models
  Pre-trained language models have achieved huge success on a wide range o...

- 09/06/2018 · Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing
  The addition of syntax-aware decoding in Neural Machine Translation (NMT...

- 02/07/2022 · Measuring and Reducing Model Update Regression in Structured Prediction for NLP
  Recent advance in deep learning has led to rapid adoption of machine lea...

- 04/30/2020 · Representations of Syntax [MASK] Useful: Effects of Constituency and Dependency Structure in Recursive LSTMs
  Sequence-based neural networks show significant sensitivity to syntactic...

- 10/23/2018 · Neural Transition-based Syntactic Linearization
  The task of linearization is to find a grammatical order given a set of ...

- 06/14/2021 · CoDERT: Distilling Encoder Representations with Co-learning for Transducer-based Speech Recognition
  We propose a simple yet effective method to compress an RNN-Transducer (...

- 05/07/2021 · Regression Bugs Are In Your Model! Measuring, Reducing and Analyzing Regressions In NLP Model Updates
  Behavior of deep neural networks can be inconsistent between different v...
