Zeyu Xiong1,*, Qiangqiang Shen1, Yueshan Xiong1, Yijie Wang1, Weizi Li2
CMC-Computers, Materials & Continua, Vol.60, No.1, pp. 259-273, 2019, DOI:10.32604/cmc.2019.05155
Abstract Word vector representations are widely used in natural language processing tasks. Most word vectors are generated from probability models whose bag-of-words features have two major weaknesses: they lose the ordering of the words and they ignore the semantics of the words. Recently, the neural-network language models CBOW and Skip-Gram have been developed as continuous-space language models that represent words as high-dimensional real-valued vectors. These vector representations have demonstrated promising results in various NLP tasks because of their superiority in capturing syntactic and contextual regularities in language. In this paper, we propose a new strategy…
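To make the contrast with bag-of-words concrete: Skip-Gram preserves local word order by training each word to predict its neighbors within a context window, while CBOW does the reverse (predicting a word from its surrounding context). A minimal sketch of how Skip-Gram forms its training pairs follows; the example sentence, window size, and function name are illustrative assumptions, not details from the paper.

```python
def skipgram_pairs(sentence, window=2):
    """Yield (center, context) training pairs for Skip-Gram.

    Each word in the sentence predicts every other word within
    `window` positions of it; CBOW would instead group all the
    context words together to predict the center word.
    """
    pairs = []
    for i, center in enumerate(sentence):
        lo = max(0, i - window)
        hi = min(len(sentence), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, sentence[j]))
    return pairs

tokens = "the quick brown fox".split()
print(skipgram_pairs(tokens, window=1))
# → [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#    ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

These (center, context) pairs are what the network's embedding layer is trained on, which is why the resulting vectors encode the ordering and contextual regularities that a bag-of-words model discards.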