RELATION EXTRACTION METHOD OF CHINESE MEDICAL TEXT BASED ON RSIG-LSTM, 1-12.

Songpu Li, Ruxin Gong, Xiaosheng Yu, and Jiaheng Li

Keywords

LSTM, activation function, relation extraction, Chinese medical text, keyword extraction

Abstract

Purpose – Long short-term memory (LSTM) networks are widely used in relation extraction. However, the tanh activation function in LSTM suffers from the vanishing gradient problem, which hinders the propagation of information through the network and degrades experimental results. Design/methodology/approach – In this paper, we propose a relation extraction method based on an RSigELUS-activated LSTM (RSig-LSTM). First, we use the bidirectional encoder representations from transformers (BERT) model to embed word information. Second, we combine a bidirectional RSig-LSTM with an attention mechanism to process the features, and a softmax classifier determines the relation type between entities in Chinese medical text. Findings – Compared with LSTM and other improved LSTM variants, the precision of RSig-LSTM rose by 0.96%–5.25%, its recall rose by 0.25%–5.25%, its F1-score rose by 0.66%–5.29%, and its time cost fell by 6.97%–33.31%. Originality/value – A local dataset is used to reflect people's physical condition and enhance the practical value of our research. Given the importance of medical entities in medical text research, we also introduce a new formula for calculating word weights in Chinese medical text.
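The core idea of replacing tanh inside the LSTM cell can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the `rsigelu` function below follows the single-parameter RSigELU form proposed by Kiliçarslan and Celik (2021), and the paper's RSigELUS variant may differ in its exact piecewise definition; `lstm_step`, `alpha`, and all shapes are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rsigelu(x, alpha=0.5):
    # Assumed single-parameter RSigELU (after Kilicarslan & Celik, 2021):
    # identity on [0, 1], sigmoid-weighted growth above 1, and ELU-like
    # saturation below 0. The paper's RSigELUS may differ in detail.
    return np.where(x > 1, x * sigmoid(x) * alpha + x,
           np.where(x >= 0, x, alpha * (np.exp(x) - 1.0)))

def lstm_step(x, h, c, W, U, b, act=np.tanh):
    # One LSTM time step; `act` replaces tanh for the cell candidate
    # and the output squashing, which is the RSig-LSTM substitution.
    H = h.shape[0]
    z = W @ x + U @ h + b            # stacked gate pre-activations (4H,)
    i = sigmoid(z[:H])               # input gate
    f = sigmoid(z[H:2*H])            # forget gate
    o = sigmoid(z[2*H:3*H])          # output gate
    g = act(z[3*H:])                 # candidate cell state
    c_new = f * c + i * g
    h_new = o * act(c_new)
    return h_new, c_new
```

Passing `act=rsigelu` instead of the default `act=np.tanh` yields the modified cell; because RSigELU is unbounded above and near-linear around the origin, its gradient does not saturate the way tanh does for large pre-activations.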
