Abstractive Dialogue Summarization
Abstractive dialogue summarization aims to generate concise, informative summaries of conversations that capture key information while discarding irrelevant details. Current research emphasizes improving the accuracy, faithfulness, and coherence of these summaries, typically using transformer-based encoder-decoder models such as BART, combined with techniques like graph-based modeling, multi-task learning, and the incorporation of commonsense knowledge. The field is important for efficiently processing large volumes of conversational data, with applications ranging from meeting-transcript analysis to improving human-computer interaction.
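To make the standard setup concrete, below is a minimal sketch of abstractive dialogue summarization with a BART-based encoder-decoder via the Hugging Face transformers library. The checkpoint name is an assumed choice (a BART model fine-tuned on the SAMSum dialogue corpus); any comparable dialogue-summarization checkpoint would serve, and the dialogue text is illustrative only.

```python
# Minimal sketch: abstractive dialogue summarization with a
# BART encoder-decoder via Hugging Face transformers.
# The checkpoint below is an assumption, not the method of any
# paper listed here: it is a BART model fine-tuned on SAMSum.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="philschmid/bart-large-cnn-samsum",  # assumed SAMSum-tuned BART
)

dialogue = (
    "Alice: Are we still meeting at 3pm to review the Q3 report?\n"
    "Bob: Yes, but can we push it to 3:30? My earlier call is running over.\n"
    "Alice: Sure. I'll update the invite and book the small conference room.\n"
    "Bob: Thanks! I'll bring the revenue figures."
)

# Abstractive summarization paraphrases the key information rather
# than extracting utterances verbatim from the dialogue.
result = summarizer(dialogue, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```

The pipeline handles tokenization and beam-search decoding internally; in practice, faithfulness and coherence issues mentioned above arise at this generation step, which is why much of the listed research modifies the model or training objective rather than the inference call.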
Papers
TANet: Thread-Aware Pretraining for Abstractive Conversational Summarization
Ze Yang, Liran Wang, Zhoujin Tian, Wei Wu, Zhoujun Li
Domain-Oriented Prefix-Tuning: Towards Efficient and Generalizable Fine-tuning for Zero-Shot Dialogue Summarization
Lulu Zhao, Fujia Zheng, Weihao Zeng, Keqing He, Weiran Xu, Huixing Jiang, Wei Wu, Yanan Wu