Reusing Keywords for Fine-grained Representations and Matchings

10/21/2022
by Li Chong, et al.

Question retrieval aims to find semantically equivalent questions for a new question and suffers from a key challenge: the lexical gap. Previous solutions mainly rely on translation models, topic models, and deep learning techniques. Distinct from these solutions, we propose a new insight: reusing important keywords to construct fine-grained semantic representations of questions, and then performing fine-grained matching to estimate the semantic similarity of two questions. Accordingly, we design a fine-grained matching network that reuses the important keywords. The network contains two cascaded units: (i) a fine-grained representation unit, which uses multi-level keyword sets to represent question semantics at different granularities; and (ii) a fine-grained matching unit, which first generates multiple comparable representation pairs for two questions, i.e., keyword set pairs, and then matches the two questions at multiple granularities and from multiple views using these comparable pairs, i.e., from global matching to local matching and from lexical matching to semantic matching. To obtain the multi-level keyword sets of a question, we propose a cross-task weakly supervised extraction model that applies question-question labeled signals from the training set of question retrieval to supervise the keyword extraction process. To construct the comparable keyword set pairs, we design a pattern-based assignment method that derives them from the multi-level keyword sets of the two questions. We conduct extensive experiments on three public datasets, and the experimental results show that our proposed model outperforms the state-of-the-art solutions.
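To make the overall pipeline concrete, the minimal Python sketch below illustrates the idea of multi-level keyword sets and comparable keyword set pairs. It is purely illustrative: the paper's actual extractor is a learned, cross-task weakly supervised model and its matching unit is a neural network, whereas here keyword selection, pairing, and the lexical Jaccard score are simplified stand-ins invented for demonstration.

```python
# Illustrative sketch only. The keyword "extractor" (longest tokens), the
# level-wise pairing, and the Jaccard score are placeholders, not the paper's
# learned extraction model, pattern-based assignment, or matching network.

def multi_level_keyword_sets(question, levels=(1, 2, 3)):
    """Toy stand-in for the keyword extractor: level k keeps the k
    'most important' tokens (here, simply the longest ones)."""
    tokens = question.lower().split()
    ranked = sorted(set(tokens), key=len, reverse=True)
    return {k: set(ranked[:k]) for k in levels}

def comparable_pairs(sets_a, sets_b):
    """Toy stand-in for pattern-based assignment: align keyword sets of the
    same level, yielding pairs from coarse (global) to fine (local)."""
    shared_levels = sorted(set(sets_a) & set(sets_b))
    return [(sets_a[k], sets_b[k]) for k in shared_levels]

def lexical_match(set_a, set_b):
    """Jaccard overlap as a crude lexical-view matching score."""
    union = set_a | set_b
    return len(set_a & set_b) / len(union) if union else 0.0

def question_similarity(q1, q2):
    """Aggregate the matching scores over all comparable keyword set pairs."""
    pairs = comparable_pairs(multi_level_keyword_sets(q1),
                             multi_level_keyword_sets(q2))
    scores = [lexical_match(a, b) for a, b in pairs]
    return sum(scores) / len(scores) if scores else 0.0

if __name__ == "__main__":
    print(question_similarity("how to reset a forgotten password",
                              "reset password when it is forgotten"))
```

In the actual model, each keyword set would be encoded into a learned representation and the per-pair scores would cover both lexical and semantic views before being aggregated; the sketch only mirrors the coarse-to-fine pairing structure.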
