Training a Neural Network in a Low-Resource Setting on Automatically Annotated Noisy Data

07/02/2018
by   Michael A. Hedderich, et al.

Manually labeled corpora are expensive to create and often not available for low-resource languages or domains. Automatic labeling approaches are a quicker and cheaper alternative for obtaining labeled data. However, these labels often contain more errors, which can deteriorate a classifier's performance when it is trained on this data. We propose a noise layer that is added to a neural network architecture. This allows modeling the noise and training on a combination of clean and noisy data. We show that in a low-resource NER task we can improve performance by up to 35% by handling the noise.
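The abstract does not spell out the architecture, but a common way to realize such a noise layer is a learnable confusion matrix placed on top of the base classifier's softmax output: the clean label distribution is multiplied by a row-stochastic matrix that models how true labels are corrupted into noisy ones. The sketch below illustrates this idea in NumPy; the names (`channel_weights`, `C`) and the identity-biased initialization are illustrative assumptions, not details from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

num_classes = 3

# Base network output (logits) for one example; p_clean is the
# model's predicted distribution over the *true* labels.
logits = np.array([[2.0, 0.5, -1.0]])
p_clean = softmax(logits)

# Hypothetical noise layer: learnable weights, pushed through a
# row-wise softmax so each row of C is a distribution
# C[i, j] ≈ p(noisy label j | true label i). Initializing near the
# identity assumes labels are more often correct than corrupted.
channel_weights = np.eye(num_classes) * 2.0   # learnable in practice
C = softmax(channel_weights, axis=1)

# Predicted distribution over the *noisy* labels. On automatically
# annotated data the loss would be computed against p_noisy; on clean
# data it would be computed against p_clean directly.
p_noisy = p_clean @ C
```

Because the noise layer is just a matrix product, it adds only `num_classes²` parameters and can be trained jointly with the base network by backpropagation.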
