KG to Text

Knowledge graph (KG) to text generation focuses on automatically converting structured knowledge from KGs into fluent, informative natural language descriptions. Current research emphasizes improving the faithfulness and informativeness of the generated text, exploring architectures such as autoregressive models, graph neural networks, and transformer-based seq2seq models, often combined with techniques such as contrastive learning and memory augmentation. The field is significant for broadening access to structured knowledge and for supporting applications such as question answering, recommendation systems, and text simplification, where converting structured data into human-readable form is crucial.
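
A common pipeline linearizes the input triples into a token sequence and decodes a description with a pretrained seq2seq transformer. The sketch below illustrates that idea only; the `<S>/<P>/<O>` markers, the `t5-small` checkpoint, and the example triples are placeholder assumptions, since practical systems fine-tune such models on KG-to-text datasets (e.g., WebNLG) rather than using an off-the-shelf checkpoint.

```python
# Minimal sketch: linearize KG triples into a flat string and generate text
# with a transformer-based seq2seq model (assumed setup, not a specific paper's method).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def linearize_triples(triples):
    # One common scheme: mark subject / predicate / object with special markers.
    return " ".join(f"<S> {s} <P> {p} <O> {o}" for s, p, o in triples)

# Hypothetical input triples for illustration.
triples = [
    ("Alan Turing", "birthPlace", "London"),
    ("Alan Turing", "field", "computer science"),
]

# Placeholder checkpoint; a KG-to-text fine-tuned model would be used in practice.
model_name = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

inputs = tokenizer(linearize_triples(triples), return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Linearization keeps the interface identical to ordinary text-to-text generation; graph-aware variants instead encode the triples with a graph neural network before decoding.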

Papers