On the Importance of Word and Sentence Representation Learning in Implicit Discourse Relation Classification
Implicit discourse relation classification is one of the most difficult parts of shallow discourse parsing, as predicting the relation without explicit connectives requires language understanding at both the text-span level and the sentence level. Previous studies mainly focus on the interactions between the two arguments. We argue that a powerful contextualized representation module, a bilateral multi-perspective matching module, and a global information fusion module are all important to implicit discourse analysis. We propose a novel model that combines these modules. Extensive experiments show that our proposed model outperforms BERT and other state-of-the-art systems on the PDTB dataset by around 8%. We also analyze the effectiveness of different modules in the implicit discourse relation classification task and demonstrate how different levels of representation learning can affect the results.
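To make the three-module design concrete, below is a minimal PyTorch sketch, not the paper's exact architecture: it assumes a BiMPM-style full-matching operation for the bilateral multi-perspective matching module, mean-pooling and concatenation for global fusion, and the four PDTB top-level relation classes. All class names, dimensions, and the number of perspectives are illustrative assumptions, and the contextualized representations are taken as given, e.g. the token outputs of a BERT-style encoder.

```python
# Illustrative sketch only; module names, shapes, and the matching
# formulation are assumptions, not the paper's published implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiPerspectiveMatch(nn.Module):
    """Full-matching variant of bilateral multi-perspective matching:
    each position of one argument is compared, under l learned
    perspectives, against a pooled vector of the other argument."""

    def __init__(self, hidden: int, perspectives: int = 8):
        super().__init__()
        # One learnable re-weighting vector per perspective.
        self.w = nn.Parameter(torch.randn(perspectives, hidden))

    def forward(self, seq: torch.Tensor, vec: torch.Tensor) -> torch.Tensor:
        # seq: (batch, len, hidden); vec: (batch, hidden)
        s = seq.unsqueeze(2) * self.w               # (batch, len, l, hidden)
        v = vec.unsqueeze(1).unsqueeze(1) * self.w  # (batch, 1,  l, hidden)
        # Cosine similarity per position and perspective.
        return F.cosine_similarity(s, v, dim=-1)    # (batch, len, l)


class ImplicitRelationClassifier(nn.Module):
    def __init__(self, hidden: int = 768, perspectives: int = 8,
                 num_relations: int = 4):  # 4 PDTB top-level classes
        super().__init__()
        self.match = MultiPerspectiveMatch(hidden, perspectives)
        # Global fusion: pooled sentence vectors plus matching signals
        # from both directions feed a small relation classifier.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden + 2 * perspectives, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_relations),
        )

    def forward(self, arg1: torch.Tensor, arg2: torch.Tensor) -> torch.Tensor:
        # arg1, arg2: contextualized token representations of the two
        # discourse arguments, shaped (batch, len, hidden), e.g. from BERT.
        pool1, pool2 = arg1.mean(dim=1), arg2.mean(dim=1)
        m12 = self.match(arg1, pool2).mean(dim=1)  # arg1 matched to arg2
        m21 = self.match(arg2, pool1).mean(dim=1)  # arg2 matched to arg1
        fused = torch.cat([pool1, pool2, m12, m21], dim=-1)
        return self.classifier(fused)              # relation logits


# Usage with random stand-ins for encoder outputs:
model = ImplicitRelationClassifier()
a1, a2 = torch.randn(2, 20, 768), torch.randn(2, 25, 768)
print(model(a1, a2).shape)  # torch.Size([2, 4])
```

Under this sketch, each learned perspective re-weights the hidden dimensions before the cosine comparison, so different perspectives can emphasize different aspects of the argument pair, and the fusion step lets the classifier see both matching directions alongside the pooled sentence representations.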