A Method for Classifying Ancient Poetry Emotions Based on Pre-trained Models and Attention Mechanisms
DOI: https://doi.org/10.54097/e7qw7v83

Keywords: Sentiment Analysis, Classical Chinese Poetry and Prose Sentiment Analysis, Pre-trained Models, Attention Mechanism

Abstract
Chinese classical poetry is a treasure of Chinese culture with rich and subtle emotional connotations. Automating the emotional analysis of ancient poetry is of great significance for fields such as digital humanities and literary studies. To address the deficiencies of existing models in classification accuracy and multi-label emotion classification for ancient poetry and prose, this paper proposes an emotion classification model that combines a pre-trained model with an attention mechanism. The model is built on the RoBERTa-wwm-ext-large pre-trained model; an attention mechanism is introduced for the multi-label classification task to capture the emotional keywords in the verses, and a weighted binary cross-entropy loss function is adopted to address the severe category imbalance in multi-label tasks. Experimental results show that the method achieves an accuracy of 90.80% on the single-label five-class task on the THU Fine-grained Sentimental Poetry Corpus (FSPC) dataset, surpassing the CDBERT model's 51.40%. On the Chinese Classical Poetry Dataset (CCPD), the multi-label 12-class task achieves a Micro-F1 of 99.40% and a Macro-F1 of 99.38%, outperforming the CDBERT model's 89.95% and 74.90%, respectively. This study validates the effectiveness of pre-trained models combined with attention mechanisms for sentiment analysis of classical poetry and provides new ideas for sentiment analysis research in classical literature.
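The architecture described in the abstract (a pre-trained encoder, an attention layer to weight emotional keywords, and a weighted binary cross-entropy loss against label imbalance) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the attention-pooling head, the `label_freq` values, and the stand-in tensor for RoBERTa-wwm-ext-large token states are all assumptions introduced here for clarity.

```python
import torch
import torch.nn as nn

class AttentionPoolingClassifier(nn.Module):
    """Multi-label head: attention-pools token states, then projects to labels."""
    def __init__(self, hidden_size=1024, num_labels=12):
        super().__init__()
        self.attn = nn.Linear(hidden_size, 1)          # scores each token
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, token_states, attention_mask):
        # token_states: (batch, seq_len, hidden); mask out padding before softmax
        scores = self.attn(token_states).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)        # (batch, seq_len)
        pooled = torch.einsum("bs,bsh->bh", weights, token_states)
        return self.classifier(pooled)                 # logits: (batch, num_labels)

# Weighted BCE: pos_weight up-weights rare labels to counter class imbalance.
# label_freq would be estimated from the training set; values here are illustrative.
num_labels = 12
label_freq = torch.full((num_labels,), 0.1)
pos_weight = (1.0 - label_freq) / label_freq
loss_fn = nn.BCEWithLogitsLoss(pos_weight=pos_weight)

head = AttentionPoolingClassifier(hidden_size=1024, num_labels=num_labels)
states = torch.randn(2, 16, 1024)                      # stand-in for encoder token states
mask = torch.ones(2, 16, dtype=torch.long)
logits = head(states, mask)
loss = loss_fn(logits, torch.zeros(2, num_labels))     # dummy multi-hot targets
```

In practice the `states` tensor would come from the last hidden layer of RoBERTa-wwm-ext-large, and a sigmoid over the logits (with a per-label threshold) would produce the final multi-label predictions.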
License
Copyright (c) 2026 Academic Journal of Science and Technology

This work is licensed under a Creative Commons Attribution 4.0 International License.