Rank-Aware Negative Training for Semi-Supervised Text Classification

9 Citations · 9 Mendeley Readers

Abstract

Semi-supervised text classification (SSTC) paradigms typically employ the spirit of self-training. The key idea is to train a deep classifier on limited labeled texts and then iteratively predict pseudo-labels for the unlabeled texts for further training. However, the performance is largely affected by the accuracy of the pseudo-labels, which may not be high in real-world scenarios. This paper presents a Rank-aware Negative Training (RNT) framework that addresses SSTC as learning with noisy labels. To alleviate the noisy information, we adapt a reasoning-with-uncertainty approach that ranks the unlabeled texts by the evidential support they receive from the labeled texts. Moreover, we propose training RNT with negative training, which is built on the concept that "the input instance does not belong to the complementary label". A complementary label is randomly selected from all labels except the on-target label. Intuitively, the probability of the true label serving as a complementary label is low, so negative training provides less noisy information during training, resulting in better performance on the test data. Finally, we evaluate the proposed solution on various text classification benchmark datasets. Our extensive experiments show that it consistently outperforms the state-of-the-art alternatives in most scenarios and achieves competitive performance in the others. The code of RNT is publicly available on GitHub.
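The negative-training idea in the abstract is concrete enough to sketch in code. The following is a minimal illustration of a complementary-label loss, not the authors' released implementation: the function name, the offset-based sampling trick, and the 1e-7 stabilizer are assumptions; only the ideas of drawing a complementary label from all classes except the assigned pseudo-label, and of minimizing confidence in it, come from the abstract.

import torch
import torch.nn.functional as F

def negative_training_loss(logits, pseudo_labels, num_classes):
    # Draw one complementary label per instance, uniformly from the
    # num_classes - 1 labels that differ from the assigned pseudo-label.
    # Adding an offset in [1, num_classes - 1] modulo num_classes
    # guarantees the complementary label never equals the pseudo-label.
    offsets = torch.randint(1, num_classes, pseudo_labels.shape,
                            device=logits.device)
    complementary = (pseudo_labels + offsets) % num_classes

    # Probability the model currently assigns to the complementary label.
    probs = F.softmax(logits, dim=-1)
    p_comp = probs.gather(1, complementary.unsqueeze(1)).squeeze(1)

    # Negative-learning objective: minimize -log(1 - p_comp), pushing
    # probability mass away from the complementary label instead of
    # toward the (possibly noisy) pseudo-label.
    return -torch.log(1.0 - p_comp + 1e-7).mean()

# Hypothetical usage: 4 instances, 5 classes, noisy pseudo-labels.
logits = torch.randn(4, 5)
pseudo = torch.tensor([0, 2, 4, 1])
loss = negative_training_loss(logits, pseudo, num_classes=5)

The robustness argument follows from the sampling: with K classes, a complementary label drawn from the K - 1 non-target labels coincides with the true label with probability at most 1/(K - 1), so a wrong pseudo-label corrupts the gradient far less often than it would under standard positive cross-entropy.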




Citation (APA)

Murtadha, A., Pan, S., Bo, W., Su, J., Cao, X., Zhang, W., & Liu, Y. (2023). Rank-Aware Negative Training for Semi-Supervised Text Classification. Transactions of the Association for Computational Linguistics, 11, 771–786. https://doi.org/10.1162/tacl_a_00574

Readers over time

[Chart: Mendeley reader counts per year, 2023–2025]

Readers' Seniority

PhD / Post grad / Masters / Doc: 2 (50%)
Professor / Associate Prof.: 1 (25%)
Lecturer / Post doc: 1 (25%)

Readers' Discipline

Computer Science: 5 (83%)
Medicine and Dentistry: 1 (17%)
