Shuaicheng Zhang, Lifu Huang


Extracting temporal relations (e.g., before, after, concurrent) among events is crucial to natural language understanding. Previous studies mainly rely on neural networks to learn effective features, or on manually crafted linguistic features, for temporal relation extraction; these approaches usually fail when the context between two events is complex or long. Inspired by an examination of available temporal relation annotations and human cognitive procedures, we propose a new Temporal Graph Transformer network that (1) explicitly finds the connection between two events in a syntactic graph constructed from one or two consecutive sentences, and (2) automatically locates the most indicative temporal cues along the path between the two event mentions, as well as among their surrounding concepts in the syntactic graph, with a new temporal-oriented attention mechanism. Experiments on the MATRES and TB-Dense datasets show that our approach significantly outperforms previous state-of-the-art methods on both end-to-end temporal relation extraction and temporal relation classification.
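The first step of the abstract, finding the connection between two event mentions in a syntactic graph, can be sketched as a shortest-path search over dependency edges. This is only an illustrative sketch: the edge list and token names below are hand-specified assumptions standing in for a real dependency parser's output, not the paper's actual pipeline.

```python
from collections import deque

def shortest_syntactic_path(edges, src, dst):
    """BFS over an undirected syntactic (dependency) graph to find the
    shortest token path connecting two event mentions."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in adj.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Toy (head, dependent) edges for: "He left before the meeting started."
# (a hypothetical parse, for illustration only)
edges = [("left", "He"), ("left", "started"), ("started", "before"),
         ("started", "meeting"), ("meeting", "the")]

print(shortest_syntactic_path(edges, "left", "started"))
# → ['left', 'started']
```

Tokens off this path, such as the cue word "before" attached to "started", are the kind of surrounding concepts the paper's temporal-oriented attention would then weigh.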



Publication Details

Date of publication:
April 19, 2021
Cornell University
Publication note:
Shuaicheng Zhang, Lifu Huang, Qiang Ning: Extracting Temporal Event Relation with Syntactic-Guided Temporal Graph Transformer. CoRR abs/2104.09570 (2021)