Paper ID: 2305.04673

PreCog: Exploring the Relation between Memorization and Performance in Pre-trained Language Models

Leonardo Ranaldi, Elena Sofia Ruzzetti, Fabio Massimo Zanzotto

Pre-trained Language Models such as BERT are impressive machines with the ability to memorize, and possibly generalize over, learning examples. We present here a small, focused contribution to the analysis of the interplay between memorization and performance of BERT in downstream tasks. We propose PreCog, a measure for evaluating memorization from pre-training, and we analyze its correlation with BERT's performance. Our experiments show that highly memorized examples are better classified, suggesting that memorization is an essential key to BERT's success.
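The abstract only sketches the analysis, but the core idea of correlating a per-example memorization score with classification correctness could be illustrated as follows. This is a minimal sketch, not the paper's actual implementation: `precog_score` and `predict` are hypothetical placeholders for the PreCog measure and a fine-tuned BERT classifier, whose exact definitions are not given here.

```python
# Minimal sketch: does a memorization score predict correct classification?
# Assumes scipy is installed; `precog_score` and `predict` are hypothetical.
from scipy.stats import pointbiserialr

def correlate_memorization_with_accuracy(examples, precog_score, predict):
    """Correlate per-example memorization with classification correctness.

    examples     -- list of (text, gold_label) pairs
    precog_score -- callable mapping text -> memorization score (float)
    predict      -- callable mapping text -> predicted label
    """
    scores = [precog_score(text) for text, _ in examples]
    correct = [int(predict(text) == gold) for text, gold in examples]
    # Point-biserial correlation: continuous score vs. binary correctness.
    r, p_value = pointbiserialr(correct, scores)
    return r, p_value
```

Under this framing, a significantly positive correlation would match the abstract's finding that highly memorized examples tend to be classified correctly.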

Submitted: May 8, 2023