NEW DELHI: Applications and websites that use AI technology to digitally undress women are on the rise in cyberspace. This has fuelled a wave of abuse online, with deepfake videos frequently surfacing across social media. According to a report released by social network analysis firm Graphika, 2.4 crore (24 million) people visited such websites and applications in September alone.
According to the report, people visit these websites for technological help in morphing nude images of women that look dangerously real. Advertisements for such apps have increased by 2,400 per cent on social media platforms; the researchers say the ads appear mostly on platforms like X and Reddit.
With the increased use of such apps, there is growing concern that women's photos may be turned into pornographic images without their consent. The widely discussed phenomenon of 'deepfake pornography' is a product of such apps, and their proliferation can lead to serious violations of the law.
An ad for an application that undresses women recently appeared on X. The ad invited users to undress the person in a photo of their choice and send that image to others. A sponsored ad for one such app was also visible on YouTube, surfacing when users searched for the word 'nudify'.
However, a Google spokesperson clarified that the company never promotes sexually explicit content, adding that such ads will be reviewed and those that violate Google's policies removed.