Coreference Information
Coreference information, the grouping of textual mentions that refer to the same real-world entity, is crucial for natural language understanding, particularly in complex settings such as long documents and dialogues. Current research focuses on improving coreference resolution across diverse languages and domains, using both multi-stage and single-stage models that typically build on pre-trained language models and may incorporate external knowledge bases or visual information to improve accuracy. These advances benefit question answering, information retrieval, and other NLP tasks, as well as applications such as analyzing media bias and literary texts.
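To make the task concrete, a coreference system's output can be thought of as clusters of mention positions that refer to the same entity. The sketch below is a deliberately naive illustration, not any of the methods from the papers listed here: it links each pronoun to the nearest preceding capitalized token, a baseline that real systems (neural, multi-stage, or knowledge-augmented) substantially outperform. All names and the heuristic itself are hypothetical.

```python
# Toy illustration of coreference output: mentions grouped into clusters
# keyed by an antecedent. The heuristic (link each pronoun to the nearest
# preceding capitalized token) is a naive baseline for exposition only.

PRONOUNS = {"he", "she", "it", "they", "him", "her", "his", "their", "them"}

def resolve_pronouns(tokens):
    """Return clusters: {antecedent token: [indices of coreferring mentions]}."""
    clusters = {}
    last_name = None
    for i, tok in enumerate(tokens):
        if tok[0].isupper():  # crude named-mention detector
            last_name = tok
            clusters.setdefault(tok, []).append(i)
        elif tok.lower() in PRONOUNS and last_name is not None:
            clusters[last_name].append(i)
    return clusters

print(resolve_pronouns("Alice said her flight was late".split()))
```

Here the pronoun "her" (index 2) is clustered with "Alice" (index 0). The heuristic fails quickly on harder cases (gender, multiple candidate antecedents, nominal mentions), which is exactly the gap the papers below address with learned models and richer context.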
Papers
Bridging Context Gaps: Leveraging Coreference Resolution for Long Contextual Understanding
Yanming Liu, Xinyue Peng, Jiannan Cao, Shi Bo, Yanxin Shen, Xuhong Zhang, Sheng Cheng, Xun Wang, Jianwei Yin, Tianyu Du
Unifying the Scope of Bridging Anaphora Types in English: Bridging Annotations in ARRAU and GUM
Lauren Levine, Amir Zeldes
Are Large Language Models Robust Coreference Resolvers?
Nghia T. Le, Alan Ritter
Arukikata Travelogue Dataset with Geographic Entity Mention, Coreference, and Link Annotation
Shohei Higashiyama, Hiroki Ouchi, Hiroki Teranishi, Hiroyuki Otomo, Yusuke Ide, Aitaro Yamamoto, Hiroyuki Shindo, Yuki Matsuda, Shoko Wakamiya, Naoya Inoue, Ikuya Yamada, Taro Watanabe