UdS Submission for the WMT 19 Automatic Post-Editing Task

08/09/2019
by Hongfei Xu et al.

In this paper, we describe our submission to the English-German APE shared task at WMT 2019. We adapt an NMT architecture originally developed for exploiting context information to the APE task, implement it in our own transformer model, and explore joint training of the APE task with a de-noising encoder.
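
The abstract mentions joint training of APE with a de-noising encoder. As a rough illustration of what such a setup can involve, the sketch below corrupts a clean token sequence (so the model can be trained to reconstruct it) and combines the APE loss with the auxiliary de-noising loss. The noise model, function names, and the weight `lam` are assumptions for illustration, not the submission's actual implementation.

```python
import random

def add_noise(tokens, drop_prob=0.1, shuffle_window=3, rng=None):
    """Corrupt a clean token sequence for an auxiliary de-noising
    objective: randomly drop tokens, then locally shuffle the rest.
    (Hypothetical noise model; the paper's exact corruption scheme
    is not specified here.)"""
    rng = rng or random.Random(0)
    kept = [t for t in tokens if rng.random() > drop_prob]
    # Local shuffle: each surviving token may move at most
    # `shuffle_window` positions from its original index.
    keys = [i + rng.uniform(0, shuffle_window) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept), key=lambda p: p[0])]

def joint_loss(ape_loss, denoise_loss, lam=0.5):
    """Joint objective: the APE loss plus a weighted de-noising
    reconstruction term (lam is a hypothetical interpolation weight)."""
    return ape_loss + lam * denoise_loss
```

In such a scheme, the de-noising term acts as a regularizer: the encoder sees corrupted input and the model is trained to recover the original sequence, sharing parameters with the APE objective.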

Related research:

- MS-UEdin Submission to the WMT2018 APE Shared Task: Dual-Source Transformer for Automatic Post-Editing (09/01/2018)
  This paper describes the Microsoft and University of Edinburgh submissio...

- The JHU-Microsoft Submission for WMT21 Quality Estimation Shared Task (09/17/2021)
  This paper presents the JHU-Microsoft joint submission for WMT 2021 qual...

- The Transference Architecture for Automatic Post-Editing (08/16/2019)
  In automatic post-editing (APE) it makes sense to condition post-editing...

- LIG-CRIStAL System for the WMT17 Automatic Post-Editing Task (07/17/2017)
  This paper presents the LIG-CRIStAL submission to the shared Automatic P...

- Unbabel's Submission to the WMT2019 APE Shared Task: BERT-based Encoder-Decoder for Automatic Post-Editing (05/30/2019)
  This paper describes Unbabel's submission to the WMT2019 APE Shared Task...

- Transformer-based Automatic Post-Editing with a Context-Aware Encoding Approach for Multi-Source Inputs (08/15/2019)
  Recent approaches to the Automatic Post-Editing (APE) research have show...

- Levenshtein Training for Word-level Quality Estimation (09/12/2021)
  We propose a novel scheme to use the Levenshtein Transformer to perform ...
