Paper ID: 2304.14590
A logical word embedding for learning grammar
Sean Deyo, Veit Elser
We introduce the logical grammar embedding (LGE), a model inspired by pregroup grammars and categorial grammars that enables unsupervised inference of lexical categories and syntactic rules from a corpus of text. LGE produces comprehensible output summarizing its inferences, has a completely transparent process for producing novel sentences, and can learn from as few as a hundred sentences.
Submitted: Apr 28, 2023