Entropy Rate
Entropy rate, a measure of the information a system produces per unit time (or per symbol), is a key concept in information theory with applications across diverse fields. Current research focuses on whether the entropy rate of natural language is constant, in particular whether estimates obtained from text remain stable across different scales and models; recent studies using advanced neural language models have questioned previously held assumptions. A related puzzle is that the power-law decay of cross-entropy observed in large language models would, if extrapolated, imply a zero entropy rate for natural language; this discrepancy is actively being investigated through the development of simplified stochastic models. These investigations have implications for understanding language processing, model scaling laws, and the efficiency of communication.
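For a stationary process, the entropy rate is the limit of the per-symbol entropy, H = lim_{n→∞} H(X_1, …, X_n)/n, and in practice it is approximated by the cross-entropy of a model evaluated on text. As a minimal illustration (not taken from any of the studies summarized above; the toy corpus and function name are hypothetical), the Python sketch below estimates an in-sample cross-entropy in bits per character under character n-gram models of increasing order. The way such estimates decrease as the context grows is the empirical quantity whose asymptotic behavior, a plateau at a constant value versus continued power-law decay toward zero, is at issue.

```python
import math
from collections import Counter, defaultdict

def cross_entropy_bits_per_char(text, order):
    """In-sample cross-entropy (bits/char) of `text` under an order-`order`
    character n-gram model with add-one smoothing. A crude sketch of an
    entropy-rate estimator, not a faithful reproduction of any cited method."""
    # Count (context, next character) occurrences.
    context_counts = defaultdict(Counter)
    for i in range(order, len(text)):
        context_counts[text[i - order:i]][text[i]] += 1

    vocab_size = len(set(text))  # used for Laplace smoothing
    total_bits, n = 0.0, 0
    for i in range(order, len(text)):
        counts = context_counts[text[i - order:i]]
        # Add-one smoothing avoids log(0) for unseen continuations.
        p = (counts[text[i]] + 1) / (sum(counts.values()) + vocab_size)
        total_bits += -math.log2(p)
        n += 1
    return total_bits / n

if __name__ == "__main__":
    # Hypothetical toy corpus; a large text file is needed for meaningful estimates.
    text = "the quick brown fox jumps over the lazy dog " * 200
    for order in range(5):
        print(f"order {order}: {cross_entropy_bits_per_char(text, order):.3f} bits/char")
```

Because the model is trained and evaluated on the same text, these numbers understate the true cross-entropy; the sketch is meant only to show the basic quantity being estimated, whereas the studies discussed above rely on large neural language models and far larger corpora.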