Gram Language Model

Gram language models, most commonly n-gram models, remain an active area of research in natural language processing, with current work focused on improving their efficiency and on integrating them with neural architectures such as transformers and neural transducers. Recent research optimizes n-gram models for applications including speech recognition, handwriting recognition, and authorship verification, often through shallow or linear fusion with neural language models, and examines how well they perform in low-resource settings. By combining the strengths of n-gram and neural approaches, these methods offer a practical balance between computational efficiency and accuracy.
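To make the two ideas above concrete, the sketch below is a minimal, illustrative Python example (not drawn from any specific paper): it trains an add-k smoothed trigram model from token counts and then scores a word with a simple shallow-fusion rule, here taken as the neural log-probability plus a weighted n-gram log-probability. The `neural_log_prob` callable and the fusion weight `lam` are placeholder assumptions; exact weighting schemes vary across systems.

```python
import math
from collections import defaultdict

class NGramLM:
    """Add-k smoothed n-gram language model estimated from raw counts."""

    def __init__(self, n=3, k=0.1):
        self.n = n
        self.k = k
        self.ngram_counts = defaultdict(int)    # counts of full n-grams
        self.context_counts = defaultdict(int)  # counts of (n-1)-gram contexts
        self.vocab = set()

    def train(self, sentences):
        for tokens in sentences:
            padded = ["<s>"] * (self.n - 1) + tokens + ["</s>"]
            self.vocab.update(padded)
            for i in range(self.n - 1, len(padded)):
                context = tuple(padded[i - self.n + 1:i])
                self.ngram_counts[context + (padded[i],)] += 1
                self.context_counts[context] += 1

    def log_prob(self, word, context):
        """Add-k smoothed log P(word | last n-1 context tokens)."""
        context = tuple(context[-(self.n - 1):])
        num = self.ngram_counts[context + (word,)] + self.k
        den = self.context_counts[context] + self.k * len(self.vocab)
        return math.log(num / den)


def shallow_fusion_score(word, context, ngram_lm, neural_log_prob, lam=0.3):
    """One common shallow-fusion form: neural score + lam * n-gram score.

    `neural_log_prob(word, context)` stands in for a neural LM scorer.
    """
    return neural_log_prob(word, context) + lam * ngram_lm.log_prob(word, context)


if __name__ == "__main__":
    lm = NGramLM(n=3, k=0.1)
    lm.train([["the", "cat", "sat"], ["the", "dog", "sat"]])

    # Placeholder neural LM: uniform over the n-gram model's vocabulary.
    uniform = lambda word, context: math.log(1.0 / len(lm.vocab))

    print(lm.log_prob("sat", ["the", "cat"]))
    print(shallow_fusion_score("sat", ["the", "cat"], lm, uniform))
```

In practice the n-gram component is cheap to train on large text corpora, which is why fusion of this kind is attractive in low-resource settings where the neural model alone is under-trained.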

Papers