Relation Extraction
Relation extraction, a core natural language processing task, aims to identify and classify relationships between entities in text. Current research focuses on improving the robustness and efficiency of relation extraction models, particularly with transformer-based architectures such as BERT and large language models (LLMs). Techniques such as attention mechanisms, graph convolutional networks, and retrieval-augmented generation are commonly incorporated to handle challenging scenarios: long sentences, ambiguous relations, and low-resource settings. These advances are crucial for building knowledge graphs, powering information retrieval systems, and enabling more sophisticated applications in diverse fields such as biomedical research and document understanding.
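To make the task concrete, the sketch below shows the standard input/output format of relation extraction: given a sentence and a marked entity pair, a model assigns a relation label, yielding a (head, relation, tail) triple. The keyword-based classifier here is a hypothetical toy stand-in for a trained model (such as a fine-tuned BERT); the cue phrases and label names are illustrative assumptions, not from any specific dataset.

```python
from typing import NamedTuple

class Triple(NamedTuple):
    """A relation extraction output: (head entity, relation label, tail entity)."""
    head: str
    relation: str
    tail: str

# Hypothetical lexical cues mapping to example relation labels.
# A real system would replace this lookup with a learned classifier.
CUES = {
    "founded": "org:founded_by",
    "born in": "per:place_of_birth",
    "works at": "per:employee_of",
}

def extract_relation(sentence: str, head: str, tail: str) -> Triple:
    """Assign a relation label to the (head, tail) entity pair in the sentence."""
    lowered = sentence.lower()
    for cue, label in CUES.items():
        if cue in lowered:
            return Triple(head, label, tail)
    # "no_relation" is the conventional fallback label in RE benchmarks.
    return Triple(head, "no_relation", tail)

triple = extract_relation(
    "Steve Jobs founded Apple in 1976.", head="Steve Jobs", tail="Apple"
)
print(triple)  # Triple(head='Steve Jobs', relation='org:founded_by', tail='Apple')
```

Triples produced this way are exactly what gets inserted as edges when populating a knowledge graph, which is why relation extraction quality directly bounds downstream graph quality.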
Papers
Recall, Retrieve and Reason: Towards Better In-Context Relation Extraction
Guozheng Li, Peng Wang, Wenjun Ke, Yikai Guo, Ke Ji, Ziyu Shang, Jiajun Liu, Zijie Xu
Meta In-Context Learning Makes Large Language Models Better Zero and Few-Shot Relation Extractors
Guozheng Li, Peng Wang, Jiajun Liu, Yikai Guo, Ke Ji, Ziyu Shang, Zijie Xu
Empirical Analysis of Dialogue Relation Extraction with Large Language Models
Guozheng Li, Zijie Xu, Ziyu Shang, Jiajun Liu, Ke Ji, Yikai Guo