Hybrid Neural Models For Sequence Modelling: The Best Of Three Worlds

09/16/2019
by Marco Dinarelli, et al.

We propose a neural architecture that combines the main strengths of the most successful neural models of recent years: bidirectional RNNs, encoder-decoder architectures, and the Transformer. Evaluation on three sequence labelling tasks yields results close to the state of the art on all tasks, and better on some of them, showing the pertinence of this hybrid architecture for this kind of task.
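To make the combination concrete, here is a minimal numpy sketch of how the three ingredients can be stacked for sequence labelling: a bidirectional recurrence over the input, a Transformer-style self-attention layer with a residual connection on top of it, and an encoder-decoder-style output loop that feeds each predicted label back into the next prediction. All dimensions, weight initialisations, and the exact wiring are illustrative assumptions, not the authors' published architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy dimensions (illustrative only).
T, d, n_labels = 5, 8, 3          # sequence length, hidden size, label set
x = rng.normal(size=(T, d))       # embedded input tokens

# 1) Bidirectional recurrence: simple Elman-style forward/backward passes.
Wf = rng.normal(size=(d, d)) * 0.1
Wb = rng.normal(size=(d, d)) * 0.1
h_fwd, h_bwd = np.zeros((T, d)), np.zeros((T, d))
h = np.zeros(d)
for t in range(T):
    h = np.tanh(x[t] + Wf @ h)
    h_fwd[t] = h
h = np.zeros(d)
for t in reversed(range(T)):
    h = np.tanh(x[t] + Wb @ h)
    h_bwd[t] = h
enc = h_fwd + h_bwd               # bidirectional encoder states

# 2) Transformer-style scaled dot-product self-attention over the states,
#    with a residual connection as in the Transformer block.
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
Q, K, V = enc @ Wq, enc @ Wk, enc @ Wv
enc = enc + softmax(Q @ K.T / np.sqrt(d)) @ V

# 3) Encoder-decoder-style output: predict one label per position,
#    feeding the embedding of the previous predicted label back in.
Wy = rng.normal(size=(2 * d, n_labels)) * 0.1
label_emb = rng.normal(size=(n_labels, d)) * 0.1
prev = np.zeros(d)
labels = []
for t in range(T):
    logits = np.concatenate([enc[t], prev]) @ Wy
    y = int(np.argmax(logits))
    labels.append(y)
    prev = label_emb[y]

print(labels)                     # one label id per input position
```

A trained version of such a model would learn all the weight matrices jointly; the point of the sketch is only the data flow, showing how bidirectional context, self-attention, and label-feedback decoding can coexist in one labeller.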


