Full Text

Research on full-text analysis of scientific literature focuses on leveraging the entirety of research papers—beyond abstracts—to improve tasks such as citation prediction, systematic review automation, and understanding document structure. Current approaches use transformer-based models like BERT and large language models (LLMs) to process this richer input, often incorporating techniques such as hierarchical attention networks to handle the long, nested structure of scientific articles (words within sentences, sentences within sections). This work matters because it can improve the efficiency and accuracy of scientific discovery: better citation tracking, faster literature reviews, and potentially deeper insight into the writing and review processes themselves.
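To make the hierarchical-attention idea concrete, here is a minimal, self-contained sketch of hierarchical pooling over a full paper. Everything in it is illustrative: the hash-based token "embeddings," the fixed query vector, and the `encode_document` helper are stand-ins for a real model's learned components, not any specific published architecture. The structure, though, mirrors the typical design: tokens are pooled into sentence vectors, sentences are attention-pooled into section vectors, and sections are attention-pooled into a single document vector.

```python
import hashlib
import math

DIM = 16  # toy embedding dimension

def embed(token):
    # Toy deterministic "embedding": hash bytes mapped to [-1, 1].
    # A real system would use learned embeddings from a transformer.
    digest = hashlib.sha256(token.encode()).digest()
    return [b / 127.5 - 1.0 for b in digest[:DIM]]

def mean_pool(vecs):
    # Average a list of vectors elementwise.
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(DIM)]

def attend(vecs, query):
    # Dot-product attention pooling: softmax(q . v_j) weights over vecs.
    scores = [sum(q * x for q, x in zip(query, v)) for v in vecs]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    return [sum(weights[j] * vecs[j][i] for j in range(len(vecs)))
            for i in range(DIM)]

def encode_document(sections, query):
    # Level 1: tokens -> sentence vectors (mean pooling).
    # Level 2: sentences -> section vector (attention pooling).
    # Level 3: sections -> document vector (attention pooling).
    section_vecs = []
    for sentences in sections:
        sent_vecs = [mean_pool([embed(t) for t in s.split()])
                     for s in sentences]
        section_vecs.append(attend(sent_vecs, query))
    return attend(section_vecs, query)

# Hypothetical paper: a list of sections, each a list of sentences.
paper = [
    ["we introduce a citation prediction model", "it reads full text"],
    ["experiments cover systematic review automation"],
]
query = embed("citation")  # task-specific query; learned in practice
doc_vec = encode_document(paper, query)
print(len(doc_vec))  # 16
```

The attention query here is fixed; in a trained hierarchical attention network it is a learned parameter, which lets the model weight sections (e.g. Methods vs. Related Work) differently per task.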

Papers