Stolen ChatGPT premium accounts up for sale on the dark web

Credit to Author: avenkat@idg.com | Date: Fri, 14 Apr 2023 06:02:00 -0700

Trade of stolen ChatGPT account credentials, especially those of premium accounts, has been on the rise on the dark web since March, enabling cybercriminals to get around OpenAI’s geofencing restrictions and gain unlimited access to ChatGPT, according to research by Check Point.

“During the last month, CPR (Check Point Research) observed an increase in the chatter in underground forums related to leaking or selling compromised ChatGPT premium accounts,” Check Point said in a blog post. “Mostly those stolen accounts are being sold, but some of the actors also share stolen ChatGPT premium accounts for free, to advertise their own services or tools to steal the accounts.”

Researchers have observed various kinds of discussions and trades related to ChatGPT on the dark web over the past month.

The latest ChatGPT-related activity on the dark web includes the leaking and free publication of ChatGPT account credentials and the trade of stolen premium ChatGPT accounts.

Cybercriminals are also trading brute-forcing and account-checker tools for ChatGPT. These tools allow attackers to break into ChatGPT accounts by running huge lists of email addresses and passwords, trying to guess the right combinations to access existing accounts.

Also on offer is ChatGPT account as a service: a dedicated service that opens ChatGPT premium accounts, most likely using stolen payment cards, Check Point said in its blog.

Cybercriminals are also offering a configuration file for SilverBullet that allows checking a set of credentials for OpenAI’s platform in an automated way, Check Point said. 

SilverBullet is a web testing suite that lets users send requests to a target web application. Cybercriminals also use it to conduct credential-stuffing and account-checking attacks against various websites, stealing accounts for online platforms in the process.

In the case of ChatGPT, researchers said, this enables attackers to steal accounts at scale. The process is fully automated and can run 50 to 200 checks per minute. The tool also supports proxies, which in many cases allows it to bypass website protections against such attacks.

“Another cybercriminal who focuses only on abuse and fraud against ChatGPT products, even named himself ‘gpt4’. In his threads, he offers for sale not only ChatGPT accounts but also a configuration for another automated tool that checks a credential’s validity,” Check Point said. 

On March 20, an English-speaking cybercriminal started advertising a ChatGPT Plus lifetime account service with 100% satisfaction guaranteed, Check Point said.

A lifetime upgrade of a regular ChatGPT Plus account, opened with an email address provided by the buyer, costs $59.99, while OpenAI’s legitimate pricing for the service is $20 per month.

“However, to reduce the costs, this underground service also offers an option to share access to ChatGPT account with another cybercriminal for $24.99, for a lifetime,” Check Point said.

There is huge demand for stolen premium ChatGPT account credentials because they help cybercriminals bypass OpenAI’s geofencing restrictions, which block use of the service in certain countries, such as Iran, Russia, and China.

Using the ChatGPT API, cybercriminals can also bypass these restrictions and make use of the premium accounts, Check Point said.

Another potential use for cybercriminals is gaining access to personal information, since ChatGPT accounts store the account owner’s recent queries.

“So, when cybercriminals steal existing accounts, they gain access to the queries from the account’s original owner. This can include personal information, details about corporate products and processes, and more,” Check Point said in the blog.

In March, Microsoft-backed OpenAI revealed that a bug in an open-source Redis client library had led to a ChatGPT outage and data leak, in which users could see other users’ personal information and chat queries.

The company acknowledged that the chat queries and personal information of approximately 1.2% of ChatGPT Plus subscribers, including subscriber names, email addresses, payment addresses, and partial credit card information, were exposed.

Various privacy and security concerns around ChatGPT have come to the fore in the last few months. Italy’s data privacy regulator has already banned ChatGPT over alleged privacy violations relating to the chatbot’s collection and storage of personal data. The authority said it would lift the temporary ban if OpenAI meets a set of data protection requirements by April 30.

The German data protection commissioner has also warned that ChatGPT could be blocked in Germany due to data security concerns.

Meanwhile, earlier this week, OpenAI announced a bug bounty program, inviting the global community of security researchers, ethical hackers, and technology enthusiasts to help the company identify and address vulnerabilities in its generative artificial intelligence systems.

OpenAI will hand out cash rewards ranging from $200 for low-severity findings up to $20,000 for exceptional discoveries.
