The AI deepfake porn pandemic that is destroying lives in South Korea
The use of deepfake technology to create non-consensual, sexually explicit content has become a major issue in South Korea, ruining dozens of lives each month and fueling child sexual abuse material and revenge porn. Deepfakes are AI-generated media that superimpose one person’s face onto another person’s body, allowing sexual material to be created without the victim’s consent or knowledge.
In what looks like an IsAnyoneUp-style controversy, thousands of cases of deepfake porn have been reported in recent years, with 297 reported deepfake sex crimes, up from 180 in 2022 and 160 in 2021.
The victims span a wide range, including teachers, military personnel, university students, and even elementary and middle school students.
The perpetrators’ main channel of communication is Telegram, a messaging app that offers encrypted messages that are virtually impossible to trace. A perpetrator typically finds pictures of the victim on Instagram or Facebook and then uses AI tools to create fake pornographic content. These Telegram groups can have hundreds to thousands of members.
The South Korean government has recognized the severity of the crisis and launched an investigation, aimed especially at protecting the minors involved. Despite this, the government has…