G-T-ERNIE: Multi-Label Classifier with Text–Label Joint Modeling for Tourism Texts
DOI: https://doi.org/10.54097/b5mdr527
Keywords: ERNIE, GCN, TextCNN, Multi-Label Text Classification, Tourism Review Analysis, Intelligent Tourism
Abstract
With the rapid growth of user reviews on tourism platforms and the rising demand for intelligent services, multi-label classification has become essential for review retrieval and service optimization. Traditional methods struggle with the complex semantics and multi-label nature of such reviews, leading to weak feature representations and reduced accuracy. To address these issues, this study proposes G-T-ERNIE, a multi-label text classification model in which ERNIE captures contextual semantics, TextCNN extracts local n-gram features, and a graph convolutional network (GCN) over a label co-occurrence graph models dependencies among labels. In addition, label semantic vectors are integrated to enrich label representations and enable joint modeling of text and labels. Experiments on a self-constructed Jingdezhen tourism review dataset show that the model outperforms existing methods on Micro-F1 and Hamming Loss. To verify generalization, additional tests on the CAIL2019 Marriage and Family Elements dataset confirm the model's effectiveness and robustness. Three sets of ablation experiments on the Jingdezhen dataset further examine the contributions of the label-structure and semantic-fusion modules, demonstrating the effectiveness of both components. Overall, the findings offer useful insights for intelligent review analysis and service optimization, supporting the digital and intelligent development of the tourism industry.
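As a concrete illustration of the pipeline the abstract describes, the following is a minimal PyTorch sketch of how such a text–label joint model might be wired together. The checkpoint name ("nghuyong/ernie-3.0-base-zh"), layer sizes, max-over-time pooling, and dot-product fusion are illustrative assumptions, not the paper's exact design.

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel

class GTErnie(nn.Module):
    """Sketch of a G-T-ERNIE-style multi-label classifier (assumed design)."""

    def __init__(self, label_adj, label_init,
                 pretrained="nghuyong/ernie-3.0-base-zh",
                 kernel_sizes=(2, 3, 4), num_filters=128):
        super().__init__()
        # ERNIE encoder: contextual semantics of the review text.
        self.encoder = AutoModel.from_pretrained(pretrained)
        hidden = self.encoder.config.hidden_size
        # TextCNN branch: local n-gram features over ERNIE token states.
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes])
        # Label graph: normalized co-occurrence adjacency (L x L tensor)
        # and label semantic vectors (L x hidden), e.g. encoded label names.
        self.register_buffer("A_hat", label_adj)
        self.label_emb = nn.Parameter(label_init.clone())
        self.gcn1 = nn.Linear(hidden, hidden)
        self.gcn2 = nn.Linear(hidden, num_filters * len(kernel_sizes))

    def forward(self, input_ids, attention_mask):
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        x = h.transpose(1, 2)  # (B, hidden, T) for Conv1d
        # Max-over-time pooling per kernel size, then concatenate.
        text = torch.cat(
            [F.relu(conv(x)).max(dim=2).values for conv in self.convs], dim=1)
        # Two-layer GCN over the label co-occurrence graph.
        g = F.relu(self.A_hat @ self.gcn1(self.label_emb))
        g = self.A_hat @ self.gcn2(g)  # (L, num_filters * len(kernel_sizes))
        # Joint text-label scoring: one logit per label.
        return text @ g.t()  # (B, L)

Training would pair these logits with nn.BCEWithLogitsLoss and threshold the per-label sigmoid outputs at inference; Micro-F1 and Hamming Loss are then computed from the binarized predictions.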
License
Copyright (c) 2026 Frontiers in Computing and Intelligent Systems

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

