Regular Language

Regular languages, the sets of strings recognized by finite state automata, are a fundamental concept in theoretical computer science with applications in diverse fields such as natural language processing and network security. Current research focuses on how well neural sequence models can represent regular languages, exploring the capabilities and limitations of architectures such as LSTMs (a recurrent neural network variant) and transformers in learning and generalizing from these languages. This research is significant because it bridges theoretical formal language theory with practical machine learning, leading to improved algorithms for tasks such as regular expression matching and grammatical inference, and offering insight into the inductive biases of neural networks.
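To make the automaton view concrete, here is a minimal sketch of a deterministic finite automaton (DFA) in Python. The particular language chosen (strings over {a, b} with an even number of 'a's) and all names are illustrative assumptions, not drawn from any specific paper:

```python
# Illustrative DFA: accepts strings over the alphabet {a, b}
# that contain an even number of 'a's. States, alphabet, and
# transitions are assumed for this example.

# transition[state][symbol] -> next state
TRANSITION = {
    "even": {"a": "odd", "b": "even"},
    "odd": {"a": "even", "b": "odd"},
}
START = "even"
ACCEPTING = {"even"}

def accepts(string):
    """Run the DFA over the input; accept iff it ends in an accepting state."""
    state = START
    for symbol in string:
        if symbol not in TRANSITION[state]:
            return False  # symbol outside the alphabet: reject
        state = TRANSITION[state][symbol]
    return state in ACCEPTING

print(accepts("abba"))  # True: two 'a's
print(accepts("ab"))    # False: one 'a'
```

Work on neural representations of regular languages asks, in effect, when a trained RNN or transformer behaves like such a transition table on unseen strings.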

Papers