Paper ID: 2405.17076

Leveraging small language models for Text2SPARQL tasks to improve the resilience of AI assistance

Felix Brei, Johannes Frey, Lars-Peter Meyer

In this work we show that language models with fewer than one billion parameters can be used to translate natural language to SPARQL queries after fine-tuning. Using three different datasets ranging from academic to real-world, we identify prerequisites that the training data must fulfill in order for the training to be successful. The goal is to empower users of semantic web technology to use AI assistance on affordable commodity hardware, making them more resilient against external factors.
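The fine-tuning setup the abstract describes rests on parallel corpora of natural-language questions paired with SPARQL queries. A minimal sketch of what one such training pair might look like when flattened into a sequence-to-sequence example (the question, query, prefixes, and prompt prefix here are illustrative assumptions, not taken from the paper's datasets):

```python
# Illustrative Text2SPARQL training pair: the natural-language question is
# the model input, the SPARQL query is the target output. Both the question
# and the query are made up for illustration; they do not come from the
# paper's datasets.
training_pair = {
    "question": "Who are the creators of the resource with identifier 2405.17076?",
    "query": (
        "PREFIX dct: <http://purl.org/dc/terms/>\n"
        "SELECT ?creator WHERE {\n"
        '  ?resource dct:identifier "2405.17076" ;\n'
        "            dct:creator ?creator .\n"
        "}"
    ),
}

def to_seq2seq_example(pair):
    """Flatten a question/query pair into (input_text, target_text)
    for fine-tuning a small sequence-to-sequence model. The task
    prefix is a common convention, assumed here for illustration."""
    return ("translate to SPARQL: " + pair["question"], pair["query"])

src, tgt = to_seq2seq_example(training_pair)
```

During fine-tuning, a small encoder-decoder model would be trained to emit `tgt` given `src`; at inference time the generated text is then run against a SPARQL endpoint.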

Submitted: May 27, 2024