Pre-Trained Language Models
Pre-trained language models (PLMs) are large neural networks trained on massive text corpora to capture the statistical regularities of language, which they then transfer to a wide range of downstream tasks. Current research focuses on improving PLM efficiency through techniques such as parameter-efficient fine-tuning, and on applying PLMs in diverse fields, including scientific text classification, mental health assessment, and financial forecasting, often building on architectures like BERT and its variants. The ability of PLMs to process and generate human language effectively has significant implications for many scientific disciplines and practical applications, from improved information retrieval to more sophisticated AI assistants.
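The parameter-efficient fine-tuning mentioned above typically updates only a small fraction of a PLM's weights while the pre-trained parameters stay frozen. The sketch below illustrates one common variant of this idea, a LoRA-style low-rank update added to a frozen linear layer; the layer sizes, rank, scaling, and the class name LoRALinear are illustrative assumptions for this sketch, not details taken from any of the papers listed here.

```python
# Minimal sketch of LoRA-style parameter-efficient fine-tuning:
# the "pre-trained" weight W is frozen, and only a low-rank update
# (alpha / rank) * B @ A is trained. Sizes and rank are illustrative.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """A frozen linear layer augmented with a trainable low-rank update."""

    def __init__(self, in_features: int, out_features: int,
                 rank: int = 8, alpha: float = 16.0):
        super().__init__()
        # Frozen stand-in for a pre-trained weight matrix.
        self.weight = nn.Parameter(torch.randn(out_features, in_features),
                                   requires_grad=False)
        # Trainable low-rank factors; B starts at zero so the initial
        # forward pass matches the frozen layer exactly.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        frozen = x @ self.weight.T
        update = (x @ self.lora_A.T) @ self.lora_B.T * self.scaling
        return frozen + update


layer = LoRALinear(768, 768, rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"trainable params: {trainable} / {total}")  # ~2% of the full matrix
```

With these settings, the trainable factors contribute 12,288 parameters against 589,824 frozen ones (about 2%), which is the kind of footprint reduction that makes fine-tuning large PLMs on modest hardware practical.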
Papers
FuseGen: PLM Fusion for Data-generation based Zero-shot Learning
Tianyuan Zou, Yang Liu, Peng Li, Jianqing Zhang, Jingjing Liu, Ya-Qin Zhang
Fighting Randomness with Randomness: Mitigating Optimisation Instability of Fine-Tuning using Delayed Ensemble and Noisy Interpolation
Branislav Pecher, Jan Cegin, Robert Belanec, Jakub Simko, Ivan Srba, Maria Bielikova
Knowledge Fusion By Evolving Weights of Language Models
Guodong Du, Jing Li, Hanting Liu, Runhua Jiang, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang
GECOBench: A Gender-Controlled Text Dataset and Benchmark for Quantifying Biases in Explanations
Rick Wilming, Artur Dox, Hjalmar Schulz, Marta Oliveira, Benedict Clark, Stefan Haufe
How Good are LLMs at Relation Extraction under Low-Resource Scenario? Comprehensive Evaluation
Dawulie Jinensibieke, Mieradilijiang Maimaiti, Wentao Xiao, Yuanhang Zheng, Xiaobo Wang
The Devil is in the Neurons: Interpreting and Mitigating Social Biases in Pre-trained Language Models
Yan Liu, Yu Liu, Xiaokang Chen, Pin-Yu Chen, Daoguang Zan, Min-Yen Kan, Tsung-Yi Ho
Experiments in News Bias Detection with Pre-Trained Neural Transformers
Tim Menzner, Jochen L. Leidner
Pcc-tuning: Breaking the Contrastive Learning Ceiling in Semantic Textual Similarity
Bowen Zhang, Chunping Li