Meta Platforms is cracking down on the “concerning growth” of so-called ‘nudify’ apps, under its existing rules against non-consensual sexual imagery.
Meta announced on Thursday that, as part of the crackdown, it is “suing Joy Timeline HK Limited, the entity behind CrushAI apps, which allow people to create AI-generated nude or sexually explicit images of individuals without their consent.”
Joy Timeline had apparently run ads on Instagram and Facebook to promote the CrushAI apps. Meta has now filed a lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent the firm from advertising its nudify apps on Meta platforms.
Nudify apps
Meta said the lawsuit follows multiple attempts by Joy Timeline HK Limited to circumvent its ad review process and continue placing these ads after they were repeatedly removed for breaking its rules.
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” said Meta. “We’ll continue to take the necessary steps – which could include legal action – against those who abuse our platforms like this.”
Meta said that a year ago it had updated its policies to make it even clearer that it doesn’t allow the promotion of nudify apps or similar services.
“We remove ads, Facebook Pages and Instagram accounts promoting these services when we become aware of them, block links to websites hosting them so they can’t be accessed from Meta platforms, and restrict search terms like ‘nudify’, ‘undress’ and ‘delete clothing’ on Facebook and Instagram so they don’t show results,” it said.
Meta also pointed out that nudify apps are advertised across the internet and are available in app stores themselves.
“This means that removing them from one platform alone isn’t enough,” it stated.
Going forward, Meta said it will remove ads, accounts and content promoting these services, and it will share information with other tech companies through the Tech Coalition’s Lantern program so they can investigate and take action as well.
Meta said that since March 2025 it has shared “more than 3,800 unique URLs to participating tech companies. We already share signals about violating child safety activity, including sextortion, with other companies, and this is an important continuation of that work.”
Nude images
Meta is not the only tech giant taking action to protect kids and adults from possible sexual exploitation.
In 2022 Apple began rolling out to a number of countries a Messages feature designed to blur images containing nudity that are sent to children. It had first launched ‘Communication Safety in Messages’ in 2021, but only in the United States.
Other tech platforms also routinely scan content for child sexual abuse material (CSAM).
In 2024 Instagram began testing nudity protection features in DMs (direct messages), which blur images detected as containing nudity. Nudity protection was turned on by default for teens under 18 globally.