Discourse Coherence

Discourse coherence, the logical and meaningful connection between parts of a text, is a crucial aspect of natural language processing, and research focuses on improving its automatic assessment and generation. Current efforts develop novel metrics and algorithms, often leveraging large language models (LLMs) alongside techniques such as topic modeling (e.g., BERTopic) and diffusion models, to better capture nuanced semantic relationships between text segments. A particular emphasis is improving the coherence of generated text in long-form content and multilingual settings. These advances have significant implications for applications including automated writing evaluation, text summarization, and the development of more reliable, human-like AI systems. Improved coherence assessment also benefits fields like healthcare, where trustworthy and factual language models are critical.
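
To make the idea of embedding-based coherence assessment concrete, the sketch below computes a very simple proxy score: the average semantic similarity between adjacent sentences. This is a minimal illustration only, not a metric from any paper listed below; it assumes the sentence-transformers package, and the model name, function name, and example texts are illustrative choices.

```python
"""Naive discourse-coherence proxy: mean semantic similarity of adjacent sentences.
A minimal sketch for illustration; real coherence metrics model far more structure."""

import numpy as np
from sentence_transformers import SentenceTransformer


def coherence_score(sentences: list[str],
                    model_name: str = "all-MiniLM-L6-v2") -> float:
    """Return the mean cosine similarity between consecutive sentences.

    Higher values suggest smoother topical transitions; this ignores
    rhetorical structure, coreference, and entity continuity entirely.
    """
    if len(sentences) < 2:
        return 1.0  # a single sentence is trivially "coherent"

    model = SentenceTransformer(model_name)
    # Unit-normalized embeddings, shape (n_sentences, dim).
    emb = model.encode(sentences, normalize_embeddings=True)

    # Dot product of unit vectors = cosine similarity of sentence i and i+1.
    sims = np.sum(emb[:-1] * emb[1:], axis=1)
    return float(sims.mean())


if __name__ == "__main__":
    coherent = [
        "The patient arrived with a fever.",
        "Doctors ran blood tests to find the cause.",
        "The results pointed to a bacterial infection.",
    ]
    shuffled = [
        "The results pointed to a bacterial infection.",
        "Stock prices rose sharply on Tuesday.",
        "Doctors ran blood tests to find the cause.",
    ]
    print(coherence_score(coherent), coherence_score(shuffled))
```

On text like the example above, the topically consistent passage should score noticeably higher than the shuffled one, which is the basic intuition that more sophisticated coherence metrics refine.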

Papers