Paper ID: 2206.12565
Construct a Sentence with Multiple Specified Words
Yuanliang Meng
This paper presents a task of finetuning a BART model to construct a sentence from an arbitrary set of specified words, which has historically been a difficult NLP task. The model is trained to produce sentences containing four given words, but once trained it can also generate sentences when fewer or more words are provided. The output sentences are generally of high quality. The model has potential real-world applications, and the task can also serve as an evaluation mechanism for any language model.
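A minimal sketch of the keyword-to-sentence setup described above, assuming a Hugging Face BART checkpoint and a plain concatenation of the specified words as the source text; the checkpoint name, prompt format, and decoding settings here are illustrative and not taken from the paper, and the model would first need to be finetuned on (word set, sentence) pairs as the paper describes.

```python
# Hypothetical sketch: generating a sentence from a set of specified words
# with a BART seq2seq model (assumes `transformers` and `torch` installed).
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-base"  # assumption: some finetuned BART checkpoint
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

def make_sentence(words):
    # Join the specified words into one source string; the actual input
    # format used for finetuning in the paper may differ.
    prompt = " ".join(words)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_length=40,
        num_beams=4,
        early_stopping=True,
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Training uses four words, but the paper notes fewer or more also work.
print(make_sentence(["dog", "park", "rain", "umbrella"]))
```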
Submitted: Jun 25, 2022