Envisioning Future from the Past: Hierarchical Duality Learning for Multi-Turn Dialogue Generation

Abstract

In this paper, we define a widely neglected property of dialogue text, duality: a hierarchical property reflected in human behaviour in daily conversations. From the logic of a conversation (or a sentence), people can infer follow-up utterances (or tokens) from the preceding text, and vice versa. We propose hierarchical duality learning for dialogue (HDLD) to simulate this cognitive ability and to generate high-quality responses that connect both previous and follow-up dialogue. HDLD exploits dualities at both the token and utterance hierarchies and maximizes the mutual information between past and future utterances. Thus, even though future text is invisible during inference, HDLD implicitly estimates future information from the dialogue history and generates responses that are both coherent and informative. In contrast to previous approaches that merely encode future text as auxiliary information during training, HDLD leverages duality to enable interaction between dialogue history and the future. This makes better use of the dialogue data and leads to improvements in both automatic and human evaluation.
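
The abstract describes HDLD only at a high level, so the sketch below is purely illustrative: it shows one common way to encode a duality constraint between a forward model p(future | past) and a backward model p(past | future), namely penalizing disagreement between the two factorizations of the joint log-probability log p(past, future). All function names, tensor shapes, and the squared penalty are assumptions made for illustration; this is not the authors' HDLD implementation.

    # Minimal sketch of a duality-style regularizer (assumed, not from the paper).
    import torch
    import torch.nn.functional as F

    def duality_regularizer(fwd_logits, future_ids, bwd_logits, past_ids,
                            log_p_past, log_p_future):
        """Penalize the squared gap between the two factorizations of
        log p(past, future):
            log p(past)   + log p(future | past)   vs.
            log p(future) + log p(past | future)

        fwd_logits:  (B, T_f, V) forward-model logits over future tokens
        future_ids:  (B, T_f)    gold future token ids
        bwd_logits:  (B, T_p, V) backward-model logits over past tokens
        past_ids:    (B, T_p)    gold past token ids
        log_p_past, log_p_future: (B,) marginal log-probs from language models
        """
        # Sequence-level conditional log-likelihoods (sum of token log-probs).
        log_p_future_given_past = -F.cross_entropy(
            fwd_logits.transpose(1, 2), future_ids, reduction="none").sum(-1)
        log_p_past_given_future = -F.cross_entropy(
            bwd_logits.transpose(1, 2), past_ids, reduction="none").sum(-1)

        forward_joint = log_p_past + log_p_future_given_past
        backward_joint = log_p_future + log_p_past_given_future
        return ((forward_joint - backward_joint) ** 2).mean()

    if __name__ == "__main__":
        # Toy shapes only, to show the expected tensor layout.
        B, T_p, T_f, V = 2, 5, 4, 100
        reg = duality_regularizer(
            torch.randn(B, T_f, V), torch.randint(0, V, (B, T_f)),
            torch.randn(B, T_p, V), torch.randint(0, V, (B, T_p)),
            torch.randn(B), torch.randn(B))
        print(reg.item())

In HDLD this kind of past-future interaction is additionally applied at both the token and utterance hierarchies; the single sequence-level term above is only meant to make the "past predicts future, and vice versa" idea concrete.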

Citation (APA)
Lv, A., Li, J., Xie, S., & Yan, R. (2023). Envisioning Future from the Past: Hierarchical Duality Learning for Multi-Turn Dialogue Generation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 7382–7394). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.407
