Discourse-Aware
Discourse-aware research aims to improve how natural language processing (NLP) models understand and generate text by explicitly modeling the relationships between sentences and utterances within a larger context, going beyond sentence-level analysis. Current work emphasizes incorporating discourse structure into various model architectures, including transformers and graph neural networks, often through pre-training on large corpora or through in-context learning with carefully selected examples. This line of research matters because it addresses the limitations of existing NLP systems on complex conversational and document-level tasks, yielding improvements in applications such as machine translation, question answering, and sentiment analysis.
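To make the idea of in-context learning with carefully selected examples concrete, the sketch below shows one plausible form of discourse-aware example selection: candidate demonstrations are filtered so that their discourse-relation labels match the relation detected in the test input before the prompt is assembled. All class names, fields, and example data are illustrative assumptions, not taken from any specific system described above.

```python
# Minimal, hypothetical sketch of discourse-aware in-context example selection.
# Demonstrations whose discourse relation matches the test input are preferred
# when building the prompt; everything here is illustrative, not a real API.

from dataclasses import dataclass

@dataclass
class Example:
    context: str             # preceding sentence or utterance
    target: str              # sentence the task operates on
    discourse_relation: str  # e.g. "Contrast", "Cause", "Elaboration"
    label: str               # task output for the demonstration

def select_demonstrations(pool, test_relation, k=3):
    """Prefer demonstrations whose discourse relation matches the test input."""
    matching = [ex for ex in pool if ex.discourse_relation == test_relation]
    others = [ex for ex in pool if ex.discourse_relation != test_relation]
    return (matching + others)[:k]

def build_prompt(demos, test_context, test_target):
    """Assemble a simple few-shot prompt from the selected demonstrations."""
    parts = []
    for ex in demos:
        parts.append(
            f"Context: {ex.context}\n"
            f"Sentence: {ex.target}\n"
            f"Discourse relation: {ex.discourse_relation}\n"
            f"Answer: {ex.label}\n"
        )
    parts.append(
        f"Context: {test_context}\n"
        f"Sentence: {test_target}\n"
        "Answer:"
    )
    return "\n".join(parts)

if __name__ == "__main__":
    pool = [
        Example("The service was slow.", "However, the food was excellent.",
                "Contrast", "positive"),
        Example("The battery died in an hour.", "So I returned it.",
                "Cause", "negative"),
    ]
    demos = select_demonstrations(pool, test_relation="Contrast", k=2)
    print(build_prompt(demos, "The plot was thin.",
                       "Still, the acting carried the film."))
```

The same selection idea generalizes: the discourse relation could be predicted by a parser rather than given, and the matching criterion could use relation embeddings instead of exact label equality.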