An Entity-Based Approach for Extracting Relationships between Chinese Characters
DOI: https://doi.org/10.54097/ajst.v6i2.9864

Keywords: Chinese relation extraction, Bidirectional GRU, Neural network, Attention mechanism, Word vector.

Abstract
Entity relation extraction is a crucial part of knowledge extraction. Compared with conventional pattern-recognition techniques, deep learning algorithms are increasingly prevalent in relation extraction tasks. However, most current Chinese relation extraction research based on remotely supervised methods and kernel functions cannot disregard the detrimental effect of noisy data on experimental results. This paper proposes a Chinese relation extraction model that combines a bidirectional GRU neural network with a two-layer attention mechanism. To address the long-range forgetting problem, the bidirectional GRU is used to fuse the input vectors. Reflecting the structural properties of the Chinese language, word vectors are employed as the input. A word-level attention mechanism extracts feature information within each sentence, and a sentence-level attention mechanism aggregates sentence features. For validation, about 1,300 data instances are extracted from news websites using a remotely supervised methodology. The experimental results show that the neural network model with the two-layer attention mechanism makes full use of the feature information in a sentence and achieves noticeably higher accuracy and recall rates than the same model without an attention mechanism.
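The pipeline described above (word vectors → bidirectional GRU → word-level attention → sentence-level attention → relation classifier) can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' implementation; all dimensions, layer names, and the simple linear-scoring form of attention are assumptions for clarity.

```python
import torch
import torch.nn as nn

class BiGRUTwoLevelAttention(nn.Module):
    """Sketch of a BiGRU relation extractor with word- and sentence-level
    attention. Hyperparameters are illustrative, not from the paper."""

    def __init__(self, vocab_size=5000, embed_dim=100,
                 hidden_dim=128, num_relations=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)   # word vectors as input
        self.gru = nn.GRU(embed_dim, hidden_dim,
                          bidirectional=True, batch_first=True)
        self.word_attn = nn.Linear(2 * hidden_dim, 1)      # word-level scores
        self.sent_attn = nn.Linear(2 * hidden_dim, 1)      # sentence-level scores
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, bag):
        # bag: (num_sentences, seq_len) token ids for one entity pair
        h, _ = self.gru(self.embed(bag))              # (S, T, 2H) fused states
        w = torch.softmax(self.word_attn(h), dim=1)   # word-level weights
        sent_vecs = (w * h).sum(dim=1)                # (S, 2H) sentence features
        a = torch.softmax(self.sent_attn(sent_vecs), dim=0)  # sentence weights
        bag_vec = (a * sent_vecs).sum(dim=0)          # (2H,) bag representation
        return self.classifier(bag_vec)               # relation logits

model = BiGRUTwoLevelAttention()
logits = model(torch.randint(0, 5000, (3, 12)))  # 3 sentences, 12 tokens each
print(logits.shape)  # torch.Size([10])
```

The sentence-level attention is what lets a remotely supervised model down-weight noisy sentences in a bag: mislabeled sentences receive small weights `a`, so they contribute little to the bag representation used for classification.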