Learning Context-Sensitive Time-Decay Attention for Role-Based Dialogue Modeling

09/05/2018
by Shang-Yu Su, et al.

Spoken language understanding (SLU) is an essential component of conversational systems. Because context provides informative cues for better understanding, dialogue history can be leveraged for contextual SLU. However, most prior work attended only to related content in the history utterances and ignored temporal information. In dialogues, the most recent utterances are intuitively more important than less recent ones, so time-aware attention should decay with distance. This paper therefore allows the model to automatically learn a time-decay attention function based on the content of each role's contexts, effectively integrating content-aware and time-aware perspectives and offering remarkable flexibility for complex dialogue contexts. Experiments on the benchmark Dialogue State Tracking Challenge (DSTC4) dataset show that the proposed role-based context-sensitive time-decay attention mechanisms significantly improve state-of-the-art contextual understanding performance.
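To make the idea concrete, the sketch below shows one way such a mechanism could look in PyTorch. This is a minimal illustration, not the authors' exact formulation: the module name, the bilinear content score, and the exponential parameterization of the decay (with a rate predicted from the current utterance, which is what makes it context-sensitive) are all assumptions made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextSensitiveTimeDecayAttention(nn.Module):
    """Illustrative sketch: attention over history utterances that combines
    a content-based score with a learned, context-dependent time decay."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear content score between the current utterance and each history turn.
        self.content_scorer = nn.Bilinear(hidden_dim, hidden_dim, 1)
        # Predicts a positive decay rate from the current utterance encoding,
        # so the decay shape adapts per dialogue (assumed parameterization).
        self.decay_rate = nn.Sequential(nn.Linear(hidden_dim, 1), nn.Softplus())

    def forward(self, current: torch.Tensor, history: torch.Tensor) -> torch.Tensor:
        """
        current: (batch, hidden_dim) encoding of the current utterance
        history: (batch, turns, hidden_dim) history encodings, oldest -> newest
        returns: (batch, hidden_dim) time-decay-weighted history summary
        """
        batch, turns, dim = history.shape
        # Distance in turns from the current utterance (newest turn has d = 1).
        distance = torch.arange(turns, 0, -1, dtype=current.dtype,
                                device=current.device)          # (turns,)
        # Content-aware score for each history turn.
        expanded = current.unsqueeze(1).expand(-1, turns, -1)
        content = self.content_scorer(expanded, history).squeeze(-1)  # (batch, turns)
        # Context-sensitive exponential decay: older turns are penalized more.
        rate = self.decay_rate(current)                         # (batch, 1)
        decay = -rate * distance.unsqueeze(0)                   # (batch, turns)
        weights = F.softmax(content + decay, dim=-1)            # (batch, turns)
        return torch.bmm(weights.unsqueeze(1), history).squeeze(1)

# Usage on random encodings, e.g. from role-specific utterance encoders:
attn = ContextSensitiveTimeDecayAttention(hidden_dim=128)
summary = attn(torch.randn(4, 128), torch.randn(4, 6, 128))  # (4, 128)
```

In this sketch, taking the softmax over the sum of the content score and the decay term lets content override recency: an older turn with a strong content match can still receive high weight, which matches the paper's goal of integrating content-aware and time-aware perspectives. In a role-based setup, one such module per speaker role could attend over that role's history separately.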
