Sentence Level
Sentence-level analysis in natural language processing focuses on understanding and processing individual sentences within larger texts, with the aim of improving downstream tasks. Current research emphasizes building robust sentence representations through techniques such as multi-task learning, transformer architectures (e.g., RoBERTa), and training objectives that combine sentence- and token-level signals to capture finer-grained information. This work is crucial for applications such as machine translation, lexicography, and automated essay scoring, where accurate sentence-level understanding is essential for high performance and better human-computer interaction.
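To make the idea of a sentence representation concrete, the following is a minimal sketch of one common approach: encoding a sentence with a pretrained transformer and mean-pooling its token embeddings into a single vector. The choice of roberta-base and mean pooling is illustrative only and is not drawn from any of the papers listed below.

```python
# Minimal sketch: deriving a fixed-size sentence representation from RoBERTa
# by mean-pooling its token embeddings over non-padding positions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base")
model.eval()

def sentence_embedding(sentence: str) -> torch.Tensor:
    """Return a single vector summarizing the sentence (mean over token states)."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        token_states = model(**inputs).last_hidden_state  # (1, seq_len, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1)          # (1, seq_len, 1)
    # Average only over real (non-padding) tokens.
    return (token_states * mask).sum(dim=1) / mask.sum(dim=1)

# Usage: similar sentences should yield nearby vectors.
emb_a = sentence_embedding("The radiology report notes a small nodule.")
emb_b = sentence_embedding("A tiny nodule is mentioned in the report.")
print(torch.nn.functional.cosine_similarity(emb_a, emb_b).item())
```

Sentence vectors of this kind serve as inputs to the downstream tasks mentioned above, such as sentence classification or translation quality evaluation.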
Papers
Leveraging Professional Radiologists' Expertise to Enhance LLMs' Evaluation for Radiology Reports
Qingqing Zhu, Xiuying Chen, Qiao Jin, Benjamin Hou, Tejas Sudharshan Mathai, Pritam Mukherjee, Xin Gao, Ronald M Summers, Zhiyong Lu
LSTM-based Deep Neural Network With A Focus on Sentence Representation for Sequential Sentence Classification in Medical Scientific Abstracts
Phat Lam, Lam Pham, Tin Nguyen, Hieu Tang, Michael Seidl, Medina Andresel, Alexander Schindler
SD-HuBERT: Sentence-Level Self-Distillation Induces Syllabic Organization in HuBERT
Cheol Jun Cho, Abdelrahman Mohamed, Shang-Wen Li, Alan W Black, Gopala K. Anumanchipalli
xCOMET: Transparent Machine Translation Evaluation through Fine-grained Error Detection
Nuno M. Guerreiro, Ricardo Rei, Daan van Stigt, Luisa Coheur, Pierre Colombo, André F. T. Martins