End-Task Oriented Textual Entailment via Deep Exploring Inter-Sentence Interactions
This work deals with SciTail, a natural entailment problem derived from a multiple-choice question answering task. The premises and hypotheses in SciTail were authored independently of each other and independently of the entailment task, which makes it more challenging than other entailment tasks: its linguistic variations are not limited by the coverage of hand-designed rules or the creativity of crowd-workers. We propose DEISTE (deep exploration of inter-sentence interactions for textual entailment) for this entailment task. Given word-to-word interactions between the premise-hypothesis pair (P, H), DEISTE consists of: (i) a parameter-dynamic convolution that lets important words in P and H play a dominant role in the learnt representations; (ii) a position-aware attentive convolution that encodes both the representations and the positions of the aligned word pairs. Experiments show that DEISTE achieves about a 5% improvement over the prior state of the art. Code & model: https://github.com/yinwenpeng/SciTail
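To make the two components concrete, below is a minimal PyTorch sketch of a DEISTE-style model, not the authors' released implementation (see the GitHub link above). The module and layer names, the dimensions, the sigmoid gating used to approximate the parameter-dynamic convolution, and the learned position embedding over hard alignments are all illustrative assumptions.

```python
# Minimal sketch (not the official code) of the two ideas named in the abstract:
# (i) a convolution over H whose inputs are gated by an attention-weighted summary
#     of P, standing in for the "parameter-dynamic convolution";
# (ii) a convolution over H words concatenated with their soft-aligned P words and
#     an embedding of the aligned position, standing in for the "position-aware
#     attentive convolution".
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeisteSketch(nn.Module):
    def __init__(self, emb_dim=300, hidden=200, max_len=60):
        super().__init__()
        self.gate = nn.Linear(emb_dim, emb_dim)            # hypothetical gating layer
        self.conv_dyn = nn.Conv1d(emb_dim, hidden, 3, padding=1)
        self.pos_emb = nn.Embedding(max_len, 20)           # embeds aligned P positions
        self.conv_pos = nn.Conv1d(emb_dim * 2 + 20, hidden, 3, padding=1)
        self.clf = nn.Linear(hidden * 2, 2)                # entail vs. not-entail

    def forward(self, P, H):                               # P: (B, Lp, d), H: (B, Lh, d)
        sim = torch.bmm(H, P.transpose(1, 2))              # word-to-word interactions (B, Lh, Lp)
        attn = F.softmax(sim, dim=-1)
        aligned = torch.bmm(attn, P)                       # soft-aligned P word per H word

        # (i) gate each H word by its aligned P context, then convolve and max-pool
        gated = H * torch.sigmoid(self.gate(aligned))
        r1 = self.conv_dyn(gated.transpose(1, 2)).max(dim=2).values

        # (ii) attach hard-alignment positions, then convolve and max-pool
        pos = self.pos_emb(sim.argmax(dim=-1))             # (B, Lh, 20)
        feats = torch.cat([H, aligned, pos], dim=-1)       # (B, Lh, 2d + 20)
        r2 = self.conv_pos(feats.transpose(1, 2)).max(dim=2).values

        return self.clf(torch.cat([r1, r2], dim=-1))       # entailment logits
```

A model of this shape can be trained end to end with a cross-entropy loss on SciTail's entails / neutral labels; the sketch deliberately omits embedding lookup, padding masks, and training loop.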