Soft Prompt
Soft prompting is a parameter-efficient technique for adapting large language models (LLMs) to specific tasks. Instead of modifying the model's core weights, it learns a short sequence of continuous embedding vectors (the "soft prompt") that is prepended to the embedded input text, steering the model's output. Current research focuses on improving the efficiency and effectiveness of soft prompt learning, including methods for generating dynamic prompts, transferring prompts between tasks, and mitigating vulnerabilities to adversarial attacks. This approach offers significant advantages in resource-constrained settings and for enhancing model robustness and safety, impacting both the efficiency of LLM deployment and the development of more secure and reliable AI systems.
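The core mechanics can be sketched in a few lines of PyTorch: learnable prompt embeddings are concatenated in front of the token embeddings, and only those prompt vectors receive gradient updates while the backbone stays frozen. The sketch below uses a hypothetical toy backbone (an embedding table plus a linear head) purely for illustration; real soft prompt tuning applies the same idea to a pretrained LLM, e.g. via the prompt-tuning utilities in Hugging Face's PEFT library.

```python
import torch
import torch.nn as nn

class SoftPromptModel(nn.Module):
    """Wraps a frozen backbone with a learnable soft prompt.

    Toy sketch: `embed` and `head` stand in for a pretrained model's
    input embeddings and task head; only `soft_prompt` is trained.
    """
    def __init__(self, embed, head, num_prompt_tokens=5, embed_dim=16):
        super().__init__()
        self.embed = embed
        self.head = head
        # The only trainable parameters: one vector per prompt token.
        self.soft_prompt = nn.Parameter(torch.randn(num_prompt_tokens, embed_dim) * 0.01)
        for p in self.embed.parameters():
            p.requires_grad = False     # backbone weights stay frozen
        for p in self.head.parameters():
            p.requires_grad = False

    def forward(self, input_ids):
        tok = self.embed(input_ids)                                   # (B, T, D)
        prompt = self.soft_prompt.unsqueeze(0).expand(tok.size(0), -1, -1)
        x = torch.cat([prompt, tok], dim=1)                           # prepend soft prompt
        return self.head(x.mean(dim=1))                               # pooled logits

torch.manual_seed(0)
embed = nn.Embedding(100, 16)
head = nn.Linear(16, 2)
model = SoftPromptModel(embed, head)

# One training step: only the soft prompt is passed to the optimizer.
opt = torch.optim.Adam([model.soft_prompt], lr=0.1)
ids = torch.randint(0, 100, (4, 8))
labels = torch.tensor([0, 1, 0, 1])
before = model.soft_prompt.detach().clone()
loss = nn.functional.cross_entropy(model(ids), labels)
loss.backward()
opt.step()
```

After the step, the soft prompt has moved while the frozen backbone is untouched; here only 5 × 16 = 80 parameters are trained, which is the source of the technique's parameter efficiency.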