An Ideal Partner with Just a Smartphone and Generative AI… No Chance for Conscience to Be Pricked: The Youngsters Committing International Romance Scams
Alarming Report [Part 2]: "Thanks to generative AI, I'm making a fortune" … "I don't feel guilty"

Romance fraud is a type of scam in which the perpetrator adopts a false identity, makes a target met through a social networking service or matching app fall in love, and then cheats that target out of money. According to the National Police Agency, there were 5,604 cases in Japan in 2013, and the damage totaled 55.2 billion yen, up 38% from the previous year.
AI lets them use young people's language freely
According to Adam (pseudonym, 46), a Nigerian man who arranged the interview with Jason, advances in generative technology have even made faked video calls possible.
"When a woman speaks in front of a computer camera, for example, a video of a different, ideal woman speaking can be shown on the target's side," he said. "This is proprietary technology developed by a Nigerian friend of mine. It's expensive, but it wins the target's trust and makes it easier to pull in large sums of cash."
The messages, too, have changed. "We used to copy and paste, or slightly rearrange, canned sentences full of affectionate pickup lines, but nowadays we let ChatGPT handle the conversation," said another "AI boy," Hassan (pseudonym, 23).
"With the template approach, the messages tended to follow one pattern, and I sometimes got caught. Generative AI is very effective because it can freely use young people's language and slang to create natural messages that fit the situation. I've fooled over 100 people so far, for a maximum of $7,000 per person, and thanks to AI I'm raking in the cash."
The room Jason showed me for this interview was furnished with a TV, a sofa, and a reception set (first photo); the atmosphere was far from poverty-stricken. Jason and Hassan are university students who own smartphones, and it is not poverty that drives them to cybercrime.
Rather, I got the impression that they regard it as a "good part-time job" they took up under the influence of the students around them. As if to confirm this, there was no hint of remorse in their conversation. They knew it was a crime, but they felt no guilt at all.
"It's a necessary cost of living"
I heard them say many times, "It's a necessary expense to survive," and "I think of it as the price I'm paid for sending naked pictures of women and entertaining the targets."
In Nigeria, there are also organized crime syndicates and "scam kings" who fleece large numbers of targets out of huge sums of money.
However, most perpetrators, like Jason and his colleagues, are ordinary young people with smartphones who cheat foreigners out of money as if it were a part-time job; organized criminal groups with a clear mastermind and foot soldiers are rare.
Because the crime is completed on a single smartphone, without the perpetrator ever meeting the target, there is no opportunity for conscience to be pricked. That, it seemed to me, is what draws them into "evil."
An age in which anyone can be fooled
Still, there was one case in which the National Police Agency got on the trail of the AI boys. Last February, in cooperation with local authorities and others, it arrested 11 Nigerians for involvement in a romance scam. There were 14 Japanese victims, and the total damage was about 150 million yen.
However, because the barrier to entry is low enough that anyone with a smartphone can clear it, the perpetrators are countless, and the full picture is far from exposed.
Ms. Terue Shinkawa, the head of CHARMS, a nonprofit organization that supports victims of romance scams, says she receives an average of 15 calls a month from victims who report having made video calls with the perpetrators.
"In one case, the perpetrator posed as a real Asian actor. In the video call, the other party's facial image glitched, so we believe it was a deepfake (fake video). In another case, the caller posed as a white man, but his English speech was unnatural. The voice may have been synthesized."
As the accuracy of AI improves, these minor "flaws" will be ironed out. Ms. Shinkawa sounds a warning:
"Many people think, 'I'm the only one who won't be fooled,' but that is an overconfidence bias everyone shares. I want people to pay close attention to information about the damage."
At this very moment in Nigeria, AI boys, smartphones in hand, are whispering seductive words to their targets.


From the April 10, 2026 issue of FRIDAY
Photography and text: Takehide Mizutani (Nonfiction writer)
