UNICEF Statement: Deepfakes Are Increasingly Used to Create Sexual Content Involving Children

Viktor Sizov, Exclusive

According to a study conducted by UNICEF in collaboration with the international organization ECPAT and Interpol across 11 countries, at least 1.2 million children reported last year that their images had been manipulated to create sexually explicit fakes. In some of the surveyed countries, one in 25 children was a victim of such practices, roughly one child in an average classroom.

Children today are aware of these risks: in some countries, up to two-thirds of respondents expressed concern that artificial intelligence could be used to create fake sexual images or videos of them. The level of concern varies by country, underscoring the need for greater awareness and stronger prevention and protection measures.

It is important to note that sexualized images of children created or altered using AI constitute child sexual abuse material (CSAM). The abuse of deepfakes is a serious problem that causes real harm.

When a child's image is used, that child is directly victimized. Even when no identifiable victim is present, such material contributes to the normalization of child sexual exploitation, fuels demand for abusive content, and complicates law enforcement efforts to identify and protect children in need of assistance.

UNICEF supports the efforts of AI developers who adopt safe approaches and robust protective measures to prevent the misuse of their technologies. The picture remains uneven, however, and many AI models are developed without adequate safeguards. Risks grow especially acute when generative AI tools are integrated into social networks, enabling the rapid spread of manipulated images.

Among the measures UNICEF is urgently calling for to combat the growing threat of AI-generated child sexual abuse material are the following:


Deepfake abuse demands urgent action. Children cannot afford to wait for laws to take effect.