Apple and Google broke their own rules by promoting “Nudify” apps, the report says.


If you want an app you've created to be available on the Apple App Store or Google Play Store, it must meet a long list of requirements, including safety standards.

But a new report released Wednesday alleges that Apple and Google violated their own rules by promoting "nudify" apps that are banned under their app store policies.

The Tech Transparency Project, part of a technology watchdog nonprofit, first revealed in January that the Apple and Google app stores hosted more than 100 nudify apps. These are apps whose sole purpose is to take photos of people, usually women, and edit them to make the person appear naked, creating so-called nonconsensual intimate images. Many of these apps use generative AI to create deepfakes.

Apple removed some of the offending apps at the time, but the follow-up investigation found that many of them are still available.

In April, TTP found that Apple and Google were still allowing users to search for a number of troubling keywords, including "nudify," "undress," and "deepnude." After digging deeper into the top 10 results across the two app stores, TTP found that 40% of the apps advertised themselves as being able to show women nude or semi-nude, according to the report.

The new report also found that Google and Apple actively promoted such apps in their stores, increasing their visibility, with Google in particular creating "a series of ads for some of the most sexually explicit apps encountered in the investigation."

Read more: How do we keep children safe online? Europe thinks age verification is the solution

Both Apple and Google have language in their policies prohibiting apps that contain "overtly sexual or pornographic material" (Apple) and sexually suggestive content in which a person is nude, obscured, or otherwise scantily clad (Google). Both have enforced these policies in the past, notably against deepfake porn apps.

But Apple and Google make money from app developers by running ads and taking a cut of paid app subscriptions. Analytics firm AppMagic found that these nudify apps have been downloaded 483 million times and have generated more than $122 million in lifetime revenue.

"This revenue stream may be the reason why both companies are not vigilant when it comes to policing apps that violate their policies," TTP wrote.


Google told CNET that Google Play does not allow apps with sexual content, and that several apps referenced in the report have been suspended for violating its policies.

Apple told CNET that it removed 15 of the apps flagged in the report and has contacted six other app developers, telling them they need to address the issues or risk being removed from the App Store. It also blocked several additional search terms flagged by TTP.

Nonconsensual sexual content is a growing problem, driven in part by artificial intelligence. We saw with startling clarity how AI apps can be used to create this illegal and offensive content at the beginning of the year, when Grok users generated 1.4 million sexually explicit fake images over a period of nine days.

Some US senators at the time called on Apple and Google to remove Grok from their app stores, but neither has done so.

This week we learned that Apple specifically reached out to Grok to express concerns about its abusive AI capabilities and threatened to remove the app. The Grok app is still available in the Apple and Google app stores and reportedly still offers the ability to create offensive sexual images using AI, despite the company saying otherwise.


