Stolen ChatGPT conversations have been found on the Dark Web, according to Singapore-based cybersecurity firm Group-IB. The theft and publication of ChatGPT conversations reveal a danger about the software that many users may not know.
According to Group-IB’s data, nearly 27,000 ChatGPT conversations were offered for sale on the Dark Web in May 2023. Most of these conversations were stolen, via malware, from devices in India and Pakistan during the past year. The United States had the sixth-largest number of stolen conversations, at 2,995, just ahead of France, which led Europe with 2,923 conversations.
What Makes ChatGPT Conversations Vulnerable?
Conversations with ChatGPT take place using a browser or through a remote connection to a ChatGPT server in the overwhelming majority of cases. If you have a local installation of ChatGPT that you access directly via a LAN, with no connection to the Internet, you are at a much lower risk for data theft, but such installations remain rare.
Hackers can steal ChatGPT conversations as they happen in one of three ways:
- Using malware programs such as Raccoon, which exfiltrate data from an infected device.
- Using eavesdropping software that captures communications as they move back and forth between your device and a ChatGPT server.
- Hacking a ChatGPT account and directly downloading past conversations.
The third method of attack is the one that may surprise many ChatGPT users. By default, ChatGPT saves your prompts and the logs of your conversations. If hackers can gain access to your account, they may be able to download complete transcripts of your past conversations. This could include sensitive business data, software code or personal information that could be used to compromise your identity or your business.
The current global distribution of ChatGPT theft may not appear to be a threat to North American users, but this is a mental trap. Hackers may be targeting particular industries or businesses overseas, but the techniques and methods they learn spread almost instantly across the globe. More ChatGPT theft will happen, and more U.S. businesses will be targeted. The only good news is that you have time to prepare.
How to Prevent ChatGPT Conversation Theft
There are a few steps ChatGPT users should take immediately to prevent data loss.
- Scan your devices for malware. This should be a common, regular practice at home and at work. Keyloggers and malware can creep onto your devices even if you practice great cyber security habits. Regular scans offer confirmation that your devices are clean.
- Disable your ChatGPT history. To do this, access the Settings in your account and turn off Chat History & Training. This forces ChatGPT to delete any conversations that are more than 30 days old. Be sure to save any conversations you want to keep outside of the ChatGPT interface, using Microsoft Word, Notepad or another program that resides on your hard drive.
- Clear your old conversations. To do this, click on your profile picture, then click on Clear Conversations. This will give you the option to remove all of your archived ChatGPT conversations.
- Beware of what you share. Even with these steps, ChatGPT will store conversations for 30 days. It is best to avoid using ChatGPT to compose documents with sensitive business information that could be valuable to rivals, or to write code that powers proprietary software, as these could easily be stolen in the event of a breach. Do not give personal details to ChatGPT, such as your address, phone number, email, login credentials or bank and credit card numbers. Hackers will mine ChatGPT logs for this information.
- Protect your ChatGPT account as fiercely as your bank account. Never share the login information for your ChatGPT account with anyone under any circumstances. If possible, use two-factor authentication and a password manager to secure your ChatGPT account. Avoid sharing a single account across an organization; instead, give every individual user their own login, protected with two-factor authentication and a password manager, for additional security.
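The advice above about not sharing sensitive data can be partly automated. As a minimal sketch, the hypothetical helper below scrubs a few obvious patterns (email addresses, phone numbers, card-like digit runs) from text before it is pasted into a ChatGPT prompt. The pattern names and the `redact` function are illustrative assumptions, not a complete data-loss-prevention tool; real sensitive data takes many more forms than these regexes catch.

```python
import re

# Illustrative patterns only -- these catch common formats,
# not every way sensitive data can appear in text.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),      # 13-16 digit card-like runs
    "PHONE": re.compile(r"\b\d{3}[- .]\d{3}[- .]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched pattern with a [REDACTED-<TYPE>] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text

sample = "Contact jane.doe@example.com or 555-867-5309 about card 4111 1111 1111 1111."
print(redact(sample))
```

Running a prompt through a scrubber like this before submission reduces, but does not eliminate, the damage a stolen conversation log can do; the safest data is the data never typed into the tool at all.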
The explosive growth of ChatGPT and its brand-new capabilities provide fertile ground for criminals. The majority of ChatGPT users probably have not considered conversation log theft as a cyber security risk, but it can be, depending on how you use this AI tool. As criminals probe new ways to harvest data from AI systems, remember that basic cyber security employee training, such as our CSI Protection Certification, will prepare employees to use new online tools with a much lower degree of risk.