Novel Representations of Word Embedding Based on the Zolu Function

Journal of Beijing Institute of Technology ›› 2020, Vol. 29 ›› Issue (4): 526-530. DOI: 10.15918/j.jbit1004-0579.20076



Abstract

Two learning models based on the Zolu function are proposed: Zolu-continuous bags of words (ZL-CBOW) and Zolu-skip-grams (ZL-SG). The Zolu function changes the slope of the ReLU used in word2vec. The proposed models can process extremely large data sets, as word2vec does, without increasing complexity, and they outperform several word embedding methods in both word similarity and syntactic accuracy. ZL-CBOW outperforms CBOW in accuracy by 8.43% on the capital-world training set and by 1.24% on the plural-verbs training set. Moreover, experimental simulations on word similarity and syntactic accuracy show that ZL-CBOW and ZL-SG are superior to LL-CBOW and LL-SG, respectively.
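The abstract states that the Zolu function replaces the slope of the ReLU inside the word2vec hidden layer, but it does not give Zolu's definition. The sketch below shows where such an activation would plug into a CBOW-style forward pass; the `zolu` function here is a hypothetical leaky-ReLU-like stand-in, not the paper's actual definition, and all names are illustrative.

```python
import numpy as np

def zolu(x):
    # Hypothetical placeholder activation, NOT the paper's Zolu definition:
    # identity for x >= 0, reduced slope otherwise, illustrating an
    # activation that "changes the slope of ReLU".
    return np.where(x >= 0, x, 0.1 * x)

def cbow_forward(context_ids, W_in, W_out, activation=zolu):
    """CBOW-style forward pass with a pluggable hidden-layer activation."""
    h = W_in[context_ids].mean(axis=0)   # average the context word vectors
    h = activation(h)                    # activation applied to the hidden layer
    scores = W_out @ h                   # one score per vocabulary word
    e = np.exp(scores - scores.max())    # numerically stable softmax
    return e / e.sum()

rng = np.random.default_rng(0)
V, D = 50, 8                             # toy vocabulary size and embedding dim
W_in = rng.normal(size=(V, D))           # input (context) embeddings
W_out = rng.normal(size=(V, D))          # output (target) embeddings
probs = cbow_forward([3, 7, 12, 41], W_in, W_out)
print(probs.shape)                       # (50,) — a distribution over the vocabulary
```

Swapping `activation=zolu` for a plain ReLU (or any other function) leaves the rest of the model unchanged, which matches the abstract's claim that the modification adds no complexity.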

Keywords

Zolu function / word embedding / continuous bags of words / word similarity / accuracy

Cite this article

Novel Representations of Word Embedding Based on the Zolu Function. Journal of Beijing Institute of Technology, 2020, 29(4): 526-530. DOI: 10.15918/j.jbit1004-0579.20076

