
Accepted for/Published in: JMIR Medical Informatics

Date Submitted: Aug 17, 2020
Date Accepted: Feb 9, 2021

The final, peer-reviewed published version of this preprint can be found here:

Zhan K, Peng W, Xiong Y, Fu H, Chen Q, Wang X, Tang B

Novel Graph-Based Model With Biaffine Attention for Family History Extraction From Clinical Text: Modeling Study

JMIR Med Inform 2021;9(4):e23587

DOI: 10.2196/23587

PMID: 33881405

PMCID: 8100876

A Novel Graph-based Model with Biaffine Attention for Family History Extraction from Clinical Text: Family History Extraction Modeling Study

  • Kecheng Zhan; 
  • Weihua Peng; 
  • Ying Xiong; 
  • Huhao Fu; 
  • Qingcai Chen; 
  • Xiaolong Wang; 
  • Buzhou Tang

ABSTRACT

Background:

Family history (FH) information, including family members, the side of the family of each family member, living status of family members, and observations of family members, plays a significant role in disease diagnosis and treatment. FH information extraction aims to extract FH information from semi-structured or unstructured text in electronic health records (EHRs). It is a challenging task involving both named entity recognition (NER) and relation extraction (RE), where named entities are family members, living statuses, and observations, and relations hold between family members and living statuses and between family members and observations.
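To make the task concrete, the following is a minimal illustrative sketch of the NER and RE outputs for a made-up sentence. The sentence, entity labels, and relation pairs are hypothetical examples, not taken from the paper or the shared-task corpus:

```python
# Hypothetical clinical sentence (illustration only, not from the corpus)
sentence = "His mother is alive and has diabetes."

# NER output: (mention, entity type)
entities = [
    ("mother", "FamilyMember"),   # side of family: maternal
    ("alive", "LivingStatus"),
    ("diabetes", "Observation"),
]

# RE output: relations anchored on the family member
relations = [
    ("mother", "alive"),      # FamilyMember - LivingStatus
    ("mother", "diabetes"),   # FamilyMember - Observation
]

print(entities)
print(relations)
```

Every relation connects a family member to either a living status or an observation, which is what makes a graph over entity mentions a natural representation.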

Objective:

This study aims to explore effective ways to extract FH information from clinical text.

Methods:

Inspired by dependency parsing, we design a novel graph-based schema to represent FH information and introduce deep biaffine attention to extract it from clinical text. In the deep biaffine attention model, we use a CNN-BiLSTM (Convolutional Neural Network-Bidirectional Long Short-Term Memory) network and BERT (Bidirectional Encoder Representations from Transformers) to encode input sentences, and deploy a biaffine classifier to extract FH information. In addition, we develop a post-processing module to adjust the results. A system based on the proposed method was developed for the 2019 n2c2/OHNLP shared task track on FH information extraction, which includes two subtasks: entity recognition and relation extraction.
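The biaffine classifier scores every ordered pair of token representations, as in deep biaffine dependency parsing. The following is a minimal NumPy sketch of that scoring step under assumed toy dimensions; the variable names, random weights, and sizes are illustrative, not the paper's actual architecture or trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 8  # toy sizes: 5 tokens, hidden dimension 8

# Head-role and dependent-role projections of the encoder
# (CNN-BiLSTM or BERT) outputs; random stand-ins here.
H_head = rng.normal(size=(n, d))
H_dep = rng.normal(size=(n, d))

U = rng.normal(size=(d, d))     # bilinear weight
w = rng.normal(size=(2 * d,))   # linear weight for [dep; head]
b = 0.1                         # scalar bias

# Biaffine score for each (dependent i, head j) pair:
# s[i, j] = H_dep[i] @ U @ H_head[j] + w @ [H_dep[i]; H_head[j]] + b
bilinear = H_dep @ U @ H_head.T                        # (n, n)
linear = (H_dep @ w[:d])[:, None] + (H_head @ w[d:])[None, :]
scores = bilinear + linear + b                         # (n, n)

# Greedy decoding: pick the highest-scoring head for each token.
pred_heads = scores.argmax(axis=1)
print(pred_heads)
```

In a trained model these scores would be fed to a softmax (or sigmoid, for multi-label arcs) and the weights learned end to end; the post-processing module mentioned above would then adjust the decoded graph.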

Results:

We conduct experiments on the corpus provided by the 2019 n2c2/OHNLP shared task track on FH information extraction. Our system achieved the highest F1 scores, 0.8823 on subtask 1 and 0.7048 on subtask 2, which are new benchmark results on the 2019 n2c2/OHNLP corpus.

Conclusions:

This study designed a novel graph-based schema to represent FH information and applied deep biaffine attention to extract it. Experimental results show the effectiveness of deep biaffine attention for FH information extraction.



© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.
