Probabilistic Context-Free Grammar
Probabilistic Context-Free Grammars (PCFGs) are probabilistic models that represent the hierarchical structure of sequential data. They are used primarily in natural language processing, but also find applications in fields such as symbolic regression and music analysis. Current research focuses on improving PCFG induction methods, often by integrating them with neural architectures such as Transformers and LSTMs to combine the strengths of symbolic and neural approaches, and on developing efficient inference techniques for large-scale applications. These advances improve performance on tasks such as parsing, machine translation, and content generation, and offer new insight into the underlying structure of complex data.
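To make the formalism concrete, the minimal sketch below defines a toy PCFG and recovers the most probable parse of a sentence via Viterbi (CKY-style) dynamic programming. It uses NLTK's PCFG and ViterbiParser classes; the grammar, rule probabilities, and example sentence are invented for illustration.

```python
# A minimal PCFG sketch using NLTK (toy grammar invented for illustration).
# The rule probabilities for each left-hand side must sum to 1.0.
import nltk

grammar = nltk.PCFG.fromstring("""
    S   -> NP VP    [1.0]
    NP  -> Det N    [0.6] | 'she'    [0.4]
    VP  -> V NP     [0.7] | V        [0.3]
    Det -> 'the'    [1.0]
    N   -> 'dog'    [0.5] | 'cat'    [0.5]
    V   -> 'saw'    [0.6] | 'sleeps' [0.4]
""")

# ViterbiParser fills a most-likely-constituent table bottom-up and
# returns the single highest-probability parse under the grammar.
parser = nltk.ViterbiParser(grammar)
for tree in parser.parse(['she', 'saw', 'the', 'dog']):
    tree.pretty_print()
    print('P(tree) =', tree.prob())  # product of the probabilities of the rules used
```

In the induction setting described above, the rules and their probabilities are not written by hand as in this toy example but are estimated from data, for instance with inside-outside expectation maximization or neural parameterizations.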
Papers
Finding path and cycle counting formulae in graphs with Deep Reinforcement Learning
Jason Piquenot, Maxime Bérar, Pierre Héroux, Jean-Yves Ramel, Romain Raveaux, Sébastien Adam
Circuit Compositions: Exploring Modular Structures in Transformer-Based Language Models
Philipp Mondorf, Sondre Wold, Barbara Plank