Dual supervised learning for natural language understanding and generation

17 Citations · 198 Mendeley Readers

Abstract

Natural language understanding (NLU) and natural language generation (NLG) are both critical research topics in the NLP and dialogue fields. Natural language understanding extracts the core semantic meaning from given utterances, while natural language generation is its opposite: its goal is to construct corresponding sentences based on given semantics. However, such a dual relationship has not been investigated in the literature. This paper proposes a novel learning framework for natural language understanding and generation built on top of dual supervised learning, providing a way to exploit the duality. Preliminary experiments show that the proposed approach boosts performance on both tasks, demonstrating the effectiveness of the dual relationship.
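The duality the abstract describes can be made concrete with the probabilistic-duality identity that dual supervised learning typically enforces: for a valid utterance–semantics pair (x, y), the chain rule gives log p(x) + log p(y|x) = log p(y) + log p(x|y), and the squared deviation from this identity is added as a regularizer to both task losses. The sketch below is illustrative only (the function name and log-probability values are placeholders, not the authors' code):

```python
# Sketch of the probabilistic-duality regularizer used in dual
# supervised learning. For a valid pair (x, y) the chain rule gives
#   log p(x) + log p(y|x) = log p(y) + log p(x|y),
# so the squared gap between the two sides can penalize both models.

def duality_penalty(log_px, log_py_given_x, log_py, log_px_given_y):
    """Squared deviation from the probabilistic-duality identity."""
    gap = (log_px + log_py_given_x) - (log_py + log_px_given_y)
    return gap * gap

# Illustrative values: NLU maps utterance x -> semantics y, NLG maps
# semantics y -> utterance x; the marginals log p(x) and log p(y) would
# come from pretrained language/semantic models (placeholders here).
penalty = duality_penalty(-12.0, -3.5, -6.0, -9.5)
print(penalty)  # 0.0: the identity holds exactly for these values
```

In training, this penalty would be weighted and added to the NLU and NLG cross-entropy losses, coupling the two otherwise independent models.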


Citation (APA)

Su, S. Y., Huang, C. W., & Chen, Y. N. (2019). Dual supervised learning for natural language understanding and generation. In ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (pp. 5472–5477). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p19-1545

Readers' Seniority

PhD / Postgrad / Masters / Doc: 68 (70%)
Researcher: 17 (18%)
Professor / Associate Prof.: 6 (6%)
Lecturer / Post doc: 6 (6%)

Readers' Discipline

Computer Science: 89 (79%)
Linguistics: 11 (10%)
Engineering: 9 (8%)
Business, Management and Accounting: 4 (4%)
