Ontology-Aware Token Embeddings for Prepositional Phrase Attachment

05/08/2017
by Pradeep Dasigi, et al.

Type-level word embeddings use the same set of parameters to represent all instances of a word regardless of its context, ignoring the inherent lexical ambiguity in language. Instead, we embed semantic concepts (or synsets) as defined in WordNet and represent a word token in a particular context by estimating a distribution over the relevant semantic concepts. We use these new, context-sensitive embeddings in a model for predicting prepositional phrase (PP) attachments and jointly learn the concept embeddings and the model parameters. We show that using context-sensitive embeddings improves the accuracy of the PP attachment model by 5.4 points, a substantial reduction in errors.
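To make the idea concrete, below is a minimal sketch (not the authors' released code) of an ontology-aware token embedding layer: each word token is represented as an attention-weighted mixture of embeddings of its WordNet synsets (and their immediate hypernyms), with the weights conditioned on a context vector. The class name `OntoAwareEmbedding`, the bilinear attention, and the zero-vector fallback for out-of-ontology words are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
from nltk.corpus import wordnet as wn


class OntoAwareEmbedding(nn.Module):
    """Sketch: context-sensitive token embedding as a mixture of synset embeddings."""

    def __init__(self, synset_vocab, dim=50):
        super().__init__()
        self.synset_ids = {s: i for i, s in enumerate(synset_vocab)}
        self.synset_emb = nn.Embedding(len(synset_vocab), dim)   # concept (synset) embeddings
        self.attn = nn.Bilinear(dim, dim, 1)                      # scores a synset against the context

    def candidate_synsets(self, word):
        # Candidate concepts: the word's synsets plus one level of hypernyms.
        names = []
        for s in wn.synsets(word):
            names.append(s.name())
            names.extend(h.name() for h in s.hypernyms())
        return [self.synset_ids[n] for n in set(names) if n in self.synset_ids]

    def forward(self, word, context_vec):
        ids = self.candidate_synsets(word)
        if not ids:
            # Out-of-ontology word: fall back to a zero vector in this sketch.
            return torch.zeros(self.synset_emb.embedding_dim)
        vecs = self.synset_emb(torch.tensor(ids))                 # (n_synsets, dim)
        scores = self.attn(vecs, context_vec.expand_as(vecs)).squeeze(-1)
        weights = torch.softmax(scores, dim=0)                    # distribution over relevant concepts
        return weights @ vecs                                     # context-sensitive token embedding
```

In a full model, this layer would feed the PP attachment classifier, and the synset embeddings and attention parameters would be trained jointly with the rest of the model, as described in the abstract.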
