Application of self-supervised learning in natural language processing

Authors

  • Ye Zhang

DOI:

https://doi.org/10.54097/urpv6i8g3j

Keywords:

Self-supervised learning, Natural language processing, Pre-trained language models, Text classification, Sentiment analysis

Abstract

Self-supervised learning trains models on unlabeled data and has had a significant impact on NLP tasks: it reduces data-annotation costs while improving performance. Its main applications include pre-trained language models such as BERT and GPT, contrastive learning, and pseudo-labeling and semi-supervised methods. It has been applied successfully to text classification, sentiment analysis, and other tasks. Future research directions include hybrid unsupervised learning, cross-modal learning, and improving model interpretability, while attending to ethical and social issues.
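The core idea the abstract describes is that self-supervised learning derives its training labels from the unlabeled data itself. As a minimal illustration (not code from the paper), the sketch below builds BERT-style masked-language-model training pairs in plain Python: some tokens are replaced with a `[MASK]` placeholder, and the original tokens at those positions become the prediction targets. The function name, mask rate, and tokenization are illustrative assumptions.

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Build one self-supervised (input, target) pair from unlabeled text.

    Each token is masked with probability `mask_rate`; the targets map
    masked positions back to the original tokens, so no manual labels
    are needed (the BERT-style masked-language-model objective).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    masked = list(tokens)
    targets = {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[i] = tok          # label = the original token
            masked[i] = mask_token    # input = corrupted sentence
    return masked, targets

sentence = "self supervised learning needs no manual labels".split()
inp, tgt = mask_tokens(sentence, mask_rate=0.3)
# `inp` is the corrupted input; `tgt` maps masked positions to originals.
```

A model pre-trained to recover `tgt` from `inp` learns general language representations that can then be fine-tuned on small labeled sets for tasks such as text classification or sentiment analysis, which is the annotation-cost reduction the abstract refers to.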

References

Zhou Xin. Study of the application of semi-supervised algorithms in natural language processing [D]. Harbin Institute of Technology [2023-12-27].

Bai Yishan, Huang Zhanyuan. Application of semi-supervised algorithms in natural language processing [J]. Electronic Technology and Software Engineering, 2017(2): 1.

Huang Chun. Research on the application of semi-supervised algorithms in natural language processing [J]. Science and Technology Innovation Guide, 2019, 16(6): 2.

Yuan Jun. A review of the application of transfer learning in natural language processing [J]. Science and Education Guide: Electronic Edition, 2021.

Tian Dong, Zhang Xining. Implementation of a weakly supervised knowledge acquisition system based on natural language processing [J]. Foreign Electronic Measurement Technology, 2017, 36(3): 4.

Jiang Zhuoren, Chen Yan, Gao Liangcai, et al. A dynamic topic model combining supervised learning [C]//CCF International Conference on Natural Language Processing and Chinese Computing. 2014.

Liu Chu. Research on implicit discourse relation recognition based on semi-supervised learning [J]. [2023-12-27].

Hu Penglong. Cross-lingual part-of-speech tagging based on semi-supervised structured learning [D]. Harbin Institute of Technology [2023-12-27].

Wu Ke. Chinese natural language processing based on deep learning [D]. Southeast University, 2014.

Chen Weile. Unsupervised named entity recognition based on a cross-lingual pre-trained model [J]. [2023-12-27].

Published

28-02-2024

Section

Articles

How to Cite

Zhang, Y. (2024). Application of self-supervised learning in natural language processing. Journal of Computing and Electronic Information Management, 12(1), 23-26. https://doi.org/10.54097/urpv6i8g3j
