Soft Prompt
Soft prompting is a parameter-efficient technique for adapting large language models (LLMs) to specific tasks by learning short sequences of continuous "soft prompt" embeddings that are prepended to the model's input embeddings, steering its output without modifying its core weights. Current research focuses on improving the efficiency and effectiveness of soft prompt learning, including methods for generating dynamic prompts, transferring prompts between tasks, and mitigating vulnerabilities to adversarial attacks. Because only a small number of prompt parameters are trained per task, the approach is attractive in resource-constrained settings and for enhancing model robustness and safety, benefiting both the efficiency of LLM deployment and the development of more secure and reliable AI systems.
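The core mechanism is simple enough to sketch directly. Below is a minimal, illustrative PyTorch/Transformers sketch of prompt tuning: the base model's weights are frozen, and only a small matrix of continuous prompt embeddings, prepended to the input token embeddings, is optimized. The model name (`gpt2`), prompt length, and learning rate are assumptions chosen for illustration, not values from any particular paper.

```python
# Minimal soft prompt tuning sketch (illustrative; gpt2 and all
# hyperparameters below are assumptions, not from a specific paper).
import torch
import torch.nn as nn
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Freeze every base-model weight; only the soft prompt is trained.
for param in model.parameters():
    param.requires_grad = False

num_prompt_tokens = 20
hidden_dim = model.config.hidden_size  # 768 for gpt2

# The soft prompt: a small matrix of trainable continuous embeddings.
soft_prompt = nn.Parameter(torch.randn(num_prompt_tokens, hidden_dim) * 0.02)

def forward_with_soft_prompt(input_ids):
    # Look up ordinary token embeddings, then prepend the soft prompt.
    token_embeds = model.get_input_embeddings()(input_ids)        # (B, T, H)
    batch_size = token_embeds.size(0)
    prompt = soft_prompt.unsqueeze(0).expand(batch_size, -1, -1)  # (B, P, H)
    inputs_embeds = torch.cat([prompt, token_embeds], dim=1)      # (B, P+T, H)
    return model(inputs_embeds=inputs_embeds)

# Only `soft_prompt` receives gradient updates; a loss would additionally
# need labels padded (e.g. with -100) to account for the prompt positions.
optimizer = torch.optim.AdamW([soft_prompt], lr=1e-3)
batch = tokenizer(["Translate to French: cheese"], return_tensors="pt")
outputs = forward_with_soft_prompt(batch["input_ids"])
```

In this setup the trainable state per task is just `num_prompt_tokens x hidden_dim` values, which is why soft prompts are cheap to store, swap between tasks, and serve alongside a single frozen base model.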