Refining the state-of-the-art in Machine Translation, optimizing NMT for the JA <-> EN language pair by leveraging personal domain expertise

02/23/2022
by Matthew Bieda et al.

This work documents the construction of an NMT (Neural Machine Translation) system for English-Japanese based on the Transformer architecture and built with the OpenNMT framework. A systematic exploration of corpus pre-processing, hyperparameter tuning, and model architecture is carried out to obtain optimal performance. The system is evaluated with standard automatic metrics such as BLEU, complemented by my subjective assessment as a Japanese linguist.
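To make the described pipeline concrete, the following is a minimal sketch of two of its stages, subword pre-processing and automatic BLEU evaluation, written in Python. It assumes SentencePiece for BPE segmentation and sacreBLEU for scoring; the file names, vocabulary size, and example sentences are hypothetical and not taken from the paper.

import sentencepiece as spm
import sacrebleu

# Train a BPE subword model on the Japanese side of the corpus
# (hypothetical file "train.ja"); an analogous model would be trained for English.
spm.SentencePieceTrainer.train(
    input="train.ja",
    model_prefix="spm_ja",
    vocab_size=32000,
    model_type="bpe",
    character_coverage=0.9995,  # common setting for Japanese, which has a large character inventory
)

# Segment a sentence into subword pieces before feeding it to the NMT model.
sp = spm.SentencePieceProcessor(model_file="spm_ja.model")
print(sp.encode("猫が好きです。", out_type=str))

# After translation and detokenization, score system output against references
# with sacreBLEU; the outer list holds reference sets, one reference per hypothesis.
hypotheses = ["I like cats."]
references = [["I like cats."]]
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")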

Related research

04/21/2017
Neural System Combination for Machine Translation
Neural machine translation (NMT) becomes a new approach to machine trans...

05/24/2019
A Call for Prudent Choice of Subword Merge Operations
Most neural machine translation systems are built upon subword units ext...

10/18/2019
A language processing algorithm for predicting tactical solutions to an operational planning problem under uncertainty
This paper is devoted to the prediction of solutions to a stochastic dis...

12/10/2020
Quantitative Approaches to the Analysis of Predictions in Neural Machine Translation (NMT)
As part of a larger project on optimal learning conditions in neural mac...

03/28/2021
PENELOPIE: Enabling Open Information Extraction for the Greek Language through Machine Translation
In this paper we present our submission for the EACL 2021 SRW; a methodo...

07/24/2018
Otem&Utem: Over- and Under-Translation Evaluation Metric for NMT
Although neural machine translation (NMT) yields promising translation pe...

06/14/2018
Morphological and Language-Agnostic Word Segmentation for NMT
The state of the art of handling rich morphology in neural machine trans...
