Systematic Literature Review and Bibliometric Analysis on Addressing the Vanishing Gradient Issue in Deep Neural Networks for Text Data

Abstract

Deep Neural Networks (DNNs) have revolutionized Natural Language Processing (NLP) and several other fields through their ability to learn complex text representations. However, DNNs are not without challenges: the vanishing gradient problem remains a major obstacle. It hinders a model's ability to capture long-term dependencies in text data, limiting its capacity to understand context, implied meanings, and semantics, and to represent intricate patterns in text. Text data's inherent sparsity and heterogeneity exacerbate the issue, increasing computational complexity and processing time. This study addresses the vanishing gradient problem encountered in DNNs when dealing with text data: we review the existing literature and conduct a bibliometric analysis to identify potential solutions. The findings contribute a comprehensive review of the existing literature and suggest effective strategies for mitigating the vanishing gradient problem in NLP tasks, paving the way for further advancements in this area of research.
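The mechanism behind the problem can be illustrated with a minimal sketch that is not drawn from the reviewed literature: when gradients are backpropagated through many sigmoid layers, each layer multiplies them by a derivative that is at most 0.25, so their norm decays roughly geometrically with depth. The depth, width, and weight scale below are illustrative assumptions, and the example uses plain NumPy rather than any particular framework.

# Illustrative sketch (assumed setup): gradient norms shrinking through a deep
# stack of sigmoid layers, i.e. the vanishing gradient effect described above.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

depth, width = 20, 64            # illustrative depth and layer width
# Small random weights, comparable to a naive Xavier-like initialization.
weights = [rng.normal(scale=0.1, size=(width, width)) for _ in range(depth)]

# Forward pass: store activations for use in the backward pass.
a = rng.normal(size=(width, 1))
activations = [a]
for W in weights:
    a = sigmoid(W @ a)
    activations.append(a)

# Backward pass: propagate a unit gradient from the output toward the input
# and record its norm at each layer.
grad = np.ones((width, 1))
for layer in range(depth - 1, -1, -1):
    out = activations[layer + 1]
    # sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))
    grad = weights[layer].T @ (grad * out * (1.0 - out))
    print(f"layer {layer:2d}: gradient norm = {np.linalg.norm(grad):.3e}")

Running the sketch typically shows the gradient norm collapsing by many orders of magnitude between the output and the earliest layers; this is the behaviour that the mitigation strategies surveyed in the review aim to counteract.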

Citation (APA)

Haroon-Sulyman, S. O., Taiye, M. A., Kamaruddin, S. S., & Ahmad, F. K. (2024). Systematic Literature Review and Bibliometric Analysis on Addressing the Vanishing Gradient Issue in Deep Neural Networks for Text Data. In Communications in Computer and Information Science (Vol. 2001 CCIS, pp. 168–181). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-99-9589-9_13
