Artificial intelligence (AI) has become a leading tool for accelerating the spread of images depicting sexual exploitation and abuse on the internet, according to a survey released on Tuesday (Feb. 6) by the non-governmental organization (NGO) SaferNet Brasil.
“The proliferation of generative AI apps allows you to take a photo of a clothed person and remove that person’s clothes,” explained Thiago Tavares, founder and CEO of SaferNet Brasil. “The photo can be manipulated to generate synthetic media that depicts that person in an ultra-realistic nude image.”
The topic was discussed at the Safer Internet Day event, which concludes this Wednesday (Feb. 7) in São Paulo. During one of the sessions, Iain Drennan, executive director of the WeProtect Global Alliance, highlighted that artificial intelligence is a new trend in sexual abuse cases worldwide. “You simply use a synthetic image; you don’t even need a real one. And this can be taken to an industrial scale. Criminals can access available AI or extended reality technology,” he explained. “This is material that may not be illegal but is being used for sexual purposes,” he warned.
The issue is broad and extends beyond image manipulation. Drennan also mentioned cases of rape that have occurred in virtual reality or metaverse rooms and are under investigation in some countries. “Are virtual reality rooms safe for girls? We need legislation to address this type of crime.”
According to the expert, the groups most vulnerable to this type of sexual exposure on the internet include children with disabilities and LGBT individuals. “We need to handle this issue sensitively to ensure that these groups do not feel further marginalized or vulnerable,” he said.
Drennan stated that legislation is one tool to prevent the proliferation of this type of crime on the internet, but more action is needed. “We require assistance from the government, the private sector, and civil society, as well as empowerment of parents and guardians. However, this should not be solely the responsibility of families and children,” he emphasized.
The SaferNet survey also revealed that reports of child sexual abuse and exploitation images on the internet reached a record high in 2023, totaling 71,867 complaints. “This represents an absolute record in terms of the number of new URLs (addresses or web pages) reported since the institution was created in 2005. It is the peak of the 18-year historical series,” said Thiago Tavares.
According to the NGO, three main factors have led to the increase in reports of child sexual abuse and exploitation images. These include the use of artificial intelligence to create such content; mass layoffs by major tech companies, which have impacted the security, integrity, and content moderation teams of some platforms; and the proliferation of the sale of self-generated nude and sexual images by teenagers.