- South Korea has passed a law banning the possession and viewing of deepfake porn.
- Violators face up to three years in prison, or a fine equivalent to about $23,000.
South Korean lawmakers have passed a law that would ban the possession and viewing of deepfake porn, imposing severe penalties on violators.
Anyone caught watching, saving, or purchasing such material could face up to three years in prison, according to Reuters.
The news agency said violators could alternatively be fined up to 30 million won, about $22,870.
The law was passed on Thursday and is awaiting the president's approval to be enacted.
South Korea had already criminalized the creation of sexually explicit deepfakes. Those caught creating such content with the intent to distribute it face up to five years in prison or a fine of 50 million won, about $38,109.
The new legislation aims to expand on the earlier law by targeting the consumers of deepfake porn, as South Korea grapples with sexually explicit AI-generated content.
A report on deepfakes by Security Hero, a US-based identity theft protection startup, found that in 2023, South Korean singers and actresses were the most commonly targeted group, making up 53% of the individuals featured in deepfake pornography.
Earlier this month, South Koreans protested in Seoul to demand an end to non-consensual deepfake porn.
These videos are often found in Telegram chatrooms, per Reuters.
According to AFP, authorities last month discovered a vast network of these deepfake porn chatrooms, which sometimes targeted school and university staff and students.
South Korean regulators met with Telegram as part of a nationwide crackdown, which led to the removal of 148 videos, per AFP.
Security Hero said that in 2023 there were 95,820 deepfake videos online, 98% of which were pornographic.
In the US, there is a bipartisan effort, led by Sens. Ted Cruz and Amy Klobuchar, to pass legislation that would criminalize the publication of non-consensual, sexually exploitative images, including AI-generated deepfakes.