Joining external context characters to improve Chinese word embedding


Abstract

In Chinese, a word is usually composed of several characters, and the semantic meaning of a word is related to both its component characters and its context. Previous studies have shown that modeling characters benefits the learning of word embeddings; however, they ignore external context characters. In this paper, we propose a novel Chinese word embedding model that considers both internal characters and external context characters. In this way, isolated characters gain more relevance, and character embeddings carry more semantic information, which improves the effectiveness of Chinese word embeddings. Experimental results show that our model outperforms other word embedding methods on word relatedness computation, analogical reasoning, and text classification tasks, and that it is empirically robust to the proportion of character modeling and to corpus size.
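The core idea in the abstract can be illustrated with a minimal sketch: a word's representation is composed from its own vector, the vectors of its internal characters, and the vectors of the characters of surrounding context words. The vocabulary, dimensionality, and the simple averaging composition below are illustrative assumptions, not the paper's actual model or training objective.

```python
import numpy as np

# Illustrative toy vocabulary of words and their characters (assumed, not from the paper).
rng = np.random.default_rng(0)
dim = 8
words = ["智能", "手机", "屏幕"]
chars = sorted({c for w in words for c in w})

# Randomly initialized word and character embeddings, as at the start of training.
word_vec = {w: rng.normal(size=dim) for w in words}
char_vec = {c: rng.normal(size=dim) for c in chars}

def word_representation(word, context_words):
    """Combine the word vector with its internal character vectors and the
    external context characters (the characters of surrounding words).
    Simple averaging is an assumption; the paper's composition may differ."""
    internal = np.mean([char_vec[c] for c in word], axis=0)
    external_chars = [c for w in context_words for c in w]
    external = np.mean([char_vec[c] for c in external_chars], axis=0)
    return (word_vec[word] + internal + external) / 3.0

v = word_representation("手机", ["智能", "屏幕"])
print(v.shape)  # (8,)
```

In such a scheme, characters shared across words and contexts tie otherwise isolated characters together, which is the mechanism the abstract credits for the added semantic information.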

CITATION STYLE

APA

Zhang, X., Liu, S., Li, Y., & Liang, W. (2017). Joining external context characters to improve Chinese word embedding. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10262 LNCS, pp. 405–415). Springer Verlag. https://doi.org/10.1007/978-3-319-59081-3_48
