Paper ID: 2406.00199

Exfiltration of personal information from ChatGPT via prompt injection

Gregory Schwartzman

We report that ChatGPT 4 and 4o are susceptible to a prompt injection attack that allows an attacker to exfiltrate users' personal data. The attack requires no third-party tools, and all users are currently affected. This vulnerability is exacerbated by the recent introduction of ChatGPT's memory feature, which allows an attacker to instruct ChatGPT to monitor the user's conversations for the desired personal data.

Submitted: May 31, 2024