
FX.co ★ Apple Removed AI Apps From Its App Store That Promoted Creating Non-Consensual Nude Images


Apple Inc. has removed several AI-based image generator apps that were allegedly marketed for producing unauthorized explicit images. A recent inquiry by 404 Media uncovered how certain companies exploited Instagram ads to promote apps capable of virtually rendering individuals nude without their consent.

Some of these ads directed users to Apple's App Store, to an app marketed as an "art generator" but built specifically for creating non-consensual explicit images. The apps featured capabilities like face-swapping in adult content and digitally removing attire from photos. The investigation not only revealed the existence of these apps but also underscored their promotion on widely used advertising platforms.

The ads for these apps ran on Instagram and were catalogued in Meta's Ad Library. Once the ads were identified and flagged, Meta promptly removed them.

Apple at first did not respond to inquiries from 404 Media. However, after receiving detailed information, such as direct links to the apps and their ads, Apple acted swiftly.

According to 404 Media, "Apple eliminated a total of three apps from the App Store, but only after we furnished the company with links to the specific apps and their associated ads. This suggests that Apple couldn't independently locate the policy-violating apps."

While Apple's removal of these apps from the App Store is commendable, lingering issues remain. Apple did not catch these apps during its App Store review process and had to rely on external parties to draw attention to their existence.

*The market analysis provided here is intended to raise your awareness; it is not an instruction to trade.