AI Gone Wrong? Fake Lingerie Images of Popular Announcer Sold in Online Auctions
Do you remember the word “Icola”?
The “ai” in “Icola” (the Japanese term “aikora”) doesn’t refer to artificial intelligence but to “idol,” while “cola” comes from “collage.” It’s a coined term for doctored photos in which an idol’s face was pasted onto someone else’s body, usually in swimwear or nude. The practice surged in Japan in the early ’90s, causing quite a stir. Recently, images of a popular female announcer, created using what could be called the “latest Icola techniques,” were being sold on a major online auction site.
Icola images of a popular female announcer at a commercial TV station
A poster titled “Super Beautiful Announcer (real female announcer’s name) in Knit Lingerie, Size A1” was listed for sale at “Current price: 3,000 yen, Buy Now: 7,500 yen, Shipping to Tokyo: 880 yen.” At least four posters featuring this announcer wearing sexy lingerie were being sold.
While it was clear at a glance that these were “Icola” images, they were crafted much more skillfully than in the past. As of late August, these images were still being sold on the online auction site.
“These images, created using AI technology, are called deepfake images. Just recently, NHK reported on a case in the U.S. in which a high school girl’s everyday photos and yearbook pictures posted on social media were manipulated into nude images and widely circulated.
In South Korea, there has been a string of cases in which fake sexual images of women, often acquaintances or classmates, are created from photos lifted from social media and shared via messaging apps like Telegram,” said a reporter from a national newspaper’s social affairs department.
In fact, composite photos existed long before this kind of technology emerged, but most were crude, made by cutting and pasting prints together and re-photographing them. Once personal computers became widespread and photo editing software affordable, however, even amateurs could easily create Icola images.
As the internet spread, more and more people posted Icola images not only of idols but also of actresses and athletes. This raised concerns about copyright and image rights violations, as well as interference with celebrities’ work, and Icola became a significant social issue. The entertainment industry, however, did not stay silent.
“There are still people who buy Icola images.”
In September 2007, a female announcer at a local station filed a criminal complaint with the Osaka Prefectural Police against a magazine publisher that had run false articles using an Icola image featuring her face, obtained from the company’s website, alleging defamation and copyright infringement.
It had long been pointed out that publishing Icola images could constitute a criminal act, and it was generally assumed that no one would engage in commercial activities involving Icola. However, it was shocking to see Icola posters being openly sold on a major online auction site.
The current images can be immediately identified as Icola, yet many people still purchase these posters.
“People who buy them know they are Icola images. Even if they’re fake, the idea of a prim and proper female announcer in a situation that could never happen is irresistible to enthusiasts,” said the aforementioned reporter.
Is the television station where the announcer, whom we’ll call A, works aware of this situation? When we inquired, the station did not appear to have noticed the listings, but it moved quickly and gave this response:
“We have taken necessary measures, including deletion requests, against illegal images and videos in the past. Moving forward, we will continue to adopt a strict stance against such images and videos.”
When I checked again in mid-September, the sales pages for the posters were no longer on the auction site, presumably because the television station had asked for the seller’s listings to be removed. Yuka Koto, a lawyer at the firm Bengoshi Hojin Hibiki, explained the situation as follows.
Some images have been viewed more than 3 million times.
“AI-generated images are new images created from original data. If, for example, an image generated from multiple pieces of original data happens to resemble a real person, questions can arise over whether the generated image is really identifiable with, and connected to, that person’s original data. In this case, however, the images are being sold by name as images of the female announcer, so even though they are AI images, identity and connection are not in dispute.
On that premise, creating and publishing an image of a person that the person would not want made public constitutes defamation of that individual. Images showing someone in underwear or swimwear involve a high degree of privacy and can be regarded as sexual in nature, so even if permission to publish had been sought, it is highly likely the individual would have refused.
In a similar case, those who ran a site publishing adult videos in which deepfake technology was used to replace the performers’ faces with those of famous celebrities were arrested on defamation charges (in November 2020, the Tokyo Metropolitan Police and the Chiba Prefectural Police arrested a systems engineer and a university student for uploading such videos to the internet).
In addition, the person who took the original photos holds the copyright to them, so using those images without permission also constitutes copyright infringement,” Koto explained.
So, if a victim files a report, what kind of investigation would take place, what charges would be filed, and what could the penalties be?
“As noted, the crime of defamation (Article 230, Paragraph 1 of the Penal Code) would be established with respect to the person depicted in the original data, punishable by imprisonment with or without work for up to three years or a fine of up to 500,000 yen. The actual sentence would depend on factors such as how many products were sold and how extreme the images are, which determine the degree of harm to the victim’s reputation, so it is hard to say at this point what the outcome would be,” noted Koto.
There are deepfake videos of Japanese actresses and idols on pornographic video sites, with some having over three million views. Many deepfake images of female celebrities are still available online.
The sellers in this case may have been after profit, but creating and publishing Icola images even just for amusement is a criminal act, and that is something everyone needs to recognize.
Reporting and text: Hiroyuki Sasaki (entertainment journalist)