MIDAS: A Dialog Act Annotation Scheme for Open Domain Human Machine Spoken Conversations

08/27/2019
by Dian Yu, et al.

Dialog act prediction is an essential language comprehension task for both dialog system building and discourse analysis. Previous dialog act schemes, such as SWBD-DAMSL, were designed for human-human conversations, in which conversation partners have perfect language understanding ability. In this paper, we design a dialog act annotation scheme, MIDAS (Machine Interaction Dialog Act Scheme), targeted at open-domain human-machine conversations. MIDAS is designed to assist machines that have limited ability to understand their human partners. MIDAS has a hierarchical structure and supports multi-label annotations. We collected and annotated a large open-domain human-machine spoken conversation dataset consisting of 24K utterances. To show the applicability of the scheme, we leverage transfer learning methods to train a multi-label dialog act prediction model and achieve an F1 score of 0.79.
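The abstract does not specify the model architecture. As a rough illustration of what a transfer-learning-based multi-label dialog act classifier might look like, the sketch below fine-tunes a pretrained BERT encoder with one sigmoid output per dialog act; the label set, the bert-base-uncased checkpoint, and the 0.5 decision threshold are illustrative assumptions, not details taken from the paper.

# Minimal sketch (not the authors' code): multi-label dialog act prediction
# via transfer learning from a pretrained encoder. The dialog act labels,
# model name, and 0.5 threshold below are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

DIALOG_ACTS = ["statement", "opinion", "yes_no_question", "command"]  # hypothetical subset

class MultiLabelDAClassifier(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased", num_labels=len(DIALOG_ACTS)):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)  # transferred weights
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = hidden.last_hidden_state[:, 0]   # [CLS] representation of the utterance
        return self.classifier(cls)            # raw logits, one per dialog act

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = MultiLabelDAClassifier()
loss_fn = nn.BCEWithLogitsLoss()               # independent sigmoid per label

# One training step on a toy utterance with a single gold label ("command").
batch = tokenizer(["can you play some music"], return_tensors="pt", padding=True)
gold = torch.tensor([[0.0, 0.0, 0.0, 1.0]])
logits = model(batch["input_ids"], batch["attention_mask"])
loss = loss_fn(logits, gold)
loss.backward()

# At inference, any label whose sigmoid probability exceeds 0.5 is predicted,
# so an utterance can carry more than one dialog act.
preds = (torch.sigmoid(logits) > 0.5).int()

This multi-label setup (independent sigmoids rather than a single softmax) matches the scheme's stated support for multi-label annotations, where one utterance may express several dialog acts at once.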
