Long-Form Novel Generation
Long-form novel generation is a challenging area of artificial intelligence research focused on creating coherent and engaging narratives of substantial length. Current research emphasizes hierarchical frameworks, often incorporating large language models (LLMs) and techniques like prompt learning, to improve plot coherence, character development, and overall narrative quality. These advancements leverage various neural network architectures, including transformers and convolutional neural networks, and explore novel approaches such as data augmentation and knowledge transfer to enhance model performance. The development of robust and creative long-form text generation has significant implications for creative writing, storytelling, and other applications requiring the generation of complex and lengthy textual content.
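As a rough illustration of the hierarchical framing mentioned above, the sketch below generates a chapter-level outline first and then expands each outline item into prose, conditioning each chapter on its neighbours so local text stays consistent with the global plan. It is a minimal sketch only: the function names (call_llm, generate_outline, expand_chapter, generate_novel) are hypothetical, and call_llm is a stub standing in for whatever text-generation backend a given system uses.

```python
# Minimal sketch of hierarchical long-form generation:
# outline first, then chapter-by-chapter expansion.

def call_llm(prompt: str) -> str:
    """Placeholder for an LLM call (hosted API, local model, etc.)."""
    return f"[generated text for prompt: {prompt[:60]}...]"

def generate_outline(premise: str, num_chapters: int = 10) -> list[str]:
    """Ask the model for a chapter-level outline of the whole novel."""
    outline_text = call_llm(
        f"Write a {num_chapters}-chapter outline for a novel about: {premise}"
    )
    # A real system would parse the outline from the model output;
    # simple chapter stubs are fabricated here for illustration.
    return [f"Chapter {i + 1}: {outline_text}" for i in range(num_chapters)]

def expand_chapter(premise: str, outline: list[str], index: int) -> str:
    """Expand one outline item, conditioning on the premise and the
    neighbouring outline entries to keep local prose globally coherent."""
    context = "\n".join(outline[max(0, index - 1): index + 2])
    return call_llm(
        f"Premise: {premise}\nOutline context:\n{context}\n"
        f"Write the full text of {outline[index]}."
    )

def generate_novel(premise: str) -> str:
    outline = generate_outline(premise)
    chapters = [expand_chapter(premise, outline, i) for i in range(len(outline))]
    return "\n\n".join(chapters)

if __name__ == "__main__":
    print(generate_novel("a lighthouse keeper who finds a map of drowned cities")[:500])
```

The two-stage structure is the point of the sketch: global decisions (plot arc, chapter ordering) are made once in the outline pass, and the expansion pass only has to stay faithful to a short local context rather than the entire preceding text.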
Papers
A Novel Prompt-tuning Method: Incorporating Scenario-specific Concepts into a Verbalizer
Yong Ma, Senlin Luo, Yu-Ming Shang, Zhengjun Li, Yong Liu
Are Language Models More Like Libraries or Like Librarians? Bibliotechnism, the Novel Reference Problem, and the Attitudes of LLMs
Harvey Lederman, Kyle Mahowald