Hybrid Attention Networks for Chinese Short Text Classification

Yujun Zhou, Jiaming Xu, Jie Cao, Bo Xu, Changliang Li, Bo Xu


To improve classification performance for Chinese short text with automatic semantic feature selection, in this paper we propose Hybrid Attention Networks (HANs), which combine word- and character-level selective attention. The model first applies an RNN and a CNN to extract semantic features from the text. It then captures class-related attentive representations from the word- and character-level features. Finally, all of the features are concatenated and fed into the output layer for classification. Experimental results on 32-class and 5-class datasets show that our model outperforms multiple baselines by combining not only the word- and character-level features of the texts but also class-related semantic features obtained via the attention mechanism.
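The pipeline described above (RNN/CNN feature extraction at two granularities, per-level attention, concatenation, and an output layer) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: all dimensions, the assignment of the RNN to words and the CNN to characters, and the simple additive attention scorer are assumptions for the sake of a runnable example.

```python
import torch
import torch.nn as nn

class HybridAttentionSketch(nn.Module):
    """Illustrative sketch only: word-level RNN and character-level CNN
    features, attention pooling over each, concatenation, then a linear
    output layer producing class logits. Hyperparameters are assumed."""

    def __init__(self, word_vocab=5000, char_vocab=3000, emb_dim=64,
                 hidden=64, num_classes=5):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, emb_dim)
        self.char_emb = nn.Embedding(char_vocab, emb_dim)
        # One plausible assignment: RNN over words, CNN over characters.
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.cnn = nn.Conv1d(emb_dim, 2 * hidden, kernel_size=3, padding=1)
        # Simple additive attention scorers, one per feature level.
        self.word_attn = nn.Linear(2 * hidden, 1)
        self.char_attn = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(4 * hidden, num_classes)

    @staticmethod
    def attend(feats, scorer):
        # feats: (batch, seq, dim) -> attention-weighted sum over seq.
        weights = torch.softmax(scorer(feats), dim=1)  # (batch, seq, 1)
        return (weights * feats).sum(dim=1)            # (batch, dim)

    def forward(self, word_ids, char_ids):
        w, _ = self.rnn(self.word_emb(word_ids))       # (batch, Tw, 2*hidden)
        c = self.cnn(self.char_emb(char_ids).transpose(1, 2)).transpose(1, 2)
        word_vec = self.attend(w, self.word_attn)
        char_vec = self.attend(torch.relu(c), self.char_attn)
        # Concatenate both attentive representations, then classify.
        return self.out(torch.cat([word_vec, char_vec], dim=-1))

net = HybridAttentionSketch()
logits = net(torch.randint(0, 5000, (2, 20)),   # batch of 2 word sequences
             torch.randint(0, 3000, (2, 40)))   # batch of 2 char sequences
print(logits.shape)
```

With a batch of two texts and five classes, the forward pass yields a `(2, 5)` logit tensor, to which a softmax and cross-entropy loss would be applied during training.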


Chinese short texts, text classification, attention mechanism, convolutional neural network, recurrent neural network
