Vol.66, No.1, 2021, pp.179-193, doi:10.32604/cmc.2020.011969
ACLSTM: A Novel Method for CQA Answer Quality Prediction Based on Question-Answer Joint Learning
Weifeng Ma*, Jiao Lou, Caoting Ji, Laibin Ma
School of Information and Electronic Engineering, Zhejiang University of Science and Technology, Hangzhou, 310023, China
* Corresponding Author: Weifeng Ma. Email: mawf@zust.edu.cn
Received 08 June 2020; Accepted 29 June 2020; Issue published 30 October 2020
To address the limitations of existing community question answering (CQA) answer quality prediction methods in capturing the semantic information of answer text, this paper proposes an answer quality prediction model based on question-answer joint learning (ACLSTM). An attention mechanism is used to capture the dependency between the members of a Question-and-Answer (Q&A) pair, while a Convolutional Neural Network (CNN) and a Long Short-Term Memory network (LSTM) extract the semantic features of the Q&A pair and compute its matching degree. In addition, the semantic representation of the answer is combined with other effective extended features to form the input of the fully connected layer. Compared with other quality prediction models, the ACLSTM model effectively improves the prediction of answer quality, in particular for medium-quality answers, whose prediction improves further once the effective extended features are added. Experiments show that, after learning with the ACLSTM model, Q&A pairs can better measure the semantic match between each other, reflecting the model's superior performance in processing the semantic information of answer text.
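To make the core idea concrete, the following is a minimal pure-Python sketch (not the authors' implementation) of attention-based question-answer matching: answer-conditioned attention weights are computed over question token vectors, a weighted question context is formed, and the matching degree is taken as the cosine similarity between that context and the answer representation. All vectors, dimensions, and function names here are illustrative assumptions; in the paper these representations would come from the CNN/LSTM encoders.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of raw attention scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attend(question_vecs, answer_vec):
    # Attention: score each question token vector against the answer
    # representation, then form a weighted sum of the question vectors.
    scores = [dot(q, answer_vec) for q in question_vecs]
    weights = softmax(scores)
    dim = len(answer_vec)
    context = [sum(w * q[i] for w, q in zip(weights, question_vecs))
               for i in range(dim)]
    return context, weights

def cosine(u, v):
    # Matching degree between the attended question context and the answer.
    nu = math.sqrt(dot(u, u))
    nv = math.sqrt(dot(v, v))
    return dot(u, v) / (nu * nv) if nu and nv else 0.0

# Toy 3-dimensional "embeddings" standing in for encoder outputs.
question = [[1.0, 0.0, 0.2], [0.1, 1.0, 0.0], [0.3, 0.2, 1.0]]
answer = [0.9, 0.1, 0.3]

context, weights = attend(question, answer)
match = cosine(context, answer)
```

In the full model, `match` would be one signal combined with the answer's semantic representation and the extended features before the fully connected layer; here it simply illustrates how attention lets the answer representation pick out the most relevant parts of the question when scoring semantic match.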
Answer quality; semantic matching; attention mechanism; community question answering
Cite This Article
W. Ma, J. Lou, C. Ji and L. Ma, "ACLSTM: a novel method for CQA answer quality prediction based on question-answer joint learning," Computers, Materials & Continua, vol. 66, no. 1, pp. 179–193, 2021.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.