In a move to protect users’ privacy and safety, Apple has removed several AI image generator applications from its App Store that promoted the creation of non-consensual nude images. The decision came after an investigation by 404 Media exposed how companies were using Instagram advertisements to promote these apps, which could undress individuals without their permission.
The investigation revealed that some of these advertisements led users directly to Apple’s App Store for an application marketed as an “art generator,” specifically designed to generate non-consensual nude images. These applications provided features such as face-swapping in adult images and digitally removing clothing in photos. The investigation not only exposed the presence of these applications but also highlighted their promotion through popular advertising platforms.
Apple took prompt action after receiving detailed information, including direct links to the applications and their advertisements. In total, three applications were removed from the App Store. However, it is concerning that Apple did not catch these applications during its App Store review process and had to rely on external parties to alert it to their existence.
The removal of these applications is a positive step toward a safer and more ethical digital environment. It underscores the responsibility tech companies bear for the content distributed through their platforms and for protecting users’ privacy and safety.
In the rapidly evolving world of AI technology, it is crucial for companies like Apple to stay vigilant and proactive in addressing potential misuse and harmful applications. By doing so, they can maintain users’ trust and contribute to the development of ethical and responsible AI practices.