
ChatGPT was used ‘to help scammers do their thing’ in Asia fraud scheme
Duncan Okindo says he was lured to Southeast Asia last year by the promise of a customer service job in Thailand. Instead, he ended up spending four months in a scam compound on the lawless Myanmar-Thai border, where he saw first-hand how criminal groups are using artificial intelligence to commit fraud at scale.
Okindo, 26, says he was struggling to find a job as the breadwinner for his family in his native Kenya when a local recruitment agency promised him work in Bangkok. The flight was his first trip overseas. On landing, he says, he was abducted at the airport and spirited across the border into the notorious KK Park complex, which was guarded by heavily armed men and fortified “like it was meant for war.”
The facility where Okindo was held was typical of the region’s scam compounds – complexes largely run by Chinese-led gangs and designed for fraud, where criminals target victims across the globe. He says he worked in a large room with hundreds of other forced laborers, all logged into desktop computers.
Many used a free version of ChatGPT to craft messages designed to trick Americans into making bogus cryptocurrency investments, he told Reuters. Such schemes are known as pig-butchering, in which scammers meticulously cultivate victims’ trust before stealing their money.
Reuters couldn’t independently verify the full details of Okindo’s account. But a representative of HAART Kenya, an anti-trafficking group involved in his rescue, confirmed he was among several Kenyans freed from scam compounds earlier this year.
The outlines of Okindo’s story also matched the accounts of about a dozen other forced laborers Reuters has interviewed.
The scam on which Okindo worked involved targeting U.S. real estate agents.
He said he trawled property websites, including Zillow.com, where agents run ads offering their services, to find potential victims – referred to as “clients.”
He would reach out in the guise of a wealthy investor, approaching dozens each week.
Zillow declined to comment. ChatGPT owner OpenAI said it “actively works to identify and disrupt scam-related misuse of ChatGPT.”
The AI’s underlying model refuses requests that break OpenAI’s anti-fraud rules, it added, and company investigators watch for abuses and cut off violators.
OpenAI declined to comment further on Okindo’s account.
The goal, set by the bosses, was to convince at least two real estate agents a day to deposit money for non-existent investments, while keeping conversations going with at least 10 “clients” at any one time. The deposits were ultimately stolen by the scam operation.