Apple launches iMessage nudity alerts for everyone

Apple has begun rolling out internationally one of the tools it announced last year to protect children under 13 who use an iPhone, iPad or Mac. Specifically, the company has activated nudity alerts in Messages in the United Kingdom, Canada, Australia and New Zealand. The feature, also known as "communication safety in Messages", aims to prevent children from seeing explicit images received through the messaging service.

The tool, which must be enabled manually in Settings, scans photos sent through the Messages app to detect nudity or other sexually explicit content, all while preserving the end-to-end encryption of media files. If the system determines that an image is explicit because it contains nudity or other sexual activity, it blurs the image and displays a warning in the chat indicating that it may be sensitive.
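For illustration only, here is a minimal Swift sketch of how an app could produce that kind of blurred placeholder for a flagged image using Core Image's Gaussian blur. The helper name and the radius value are assumptions for the example, not Apple's actual Messages implementation.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

/// Hypothetical helper: returns a heavily blurred copy of an image, similar in
/// spirit to the "may be sensitive" placeholder Messages shows. Not Apple's code.
func blurredPlaceholder(for image: UIImage, radius: Float = 30) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    // Clamp edges first so the blur does not fade to transparency at the borders.
    let filter = CIFilter.gaussianBlur()
    filter.inputImage = input.clampedToExtent()
    filter.radius = radius

    // Crop back to the original extent and render to a CGImage.
    guard let output = filter.outputImage?.cropped(to: input.extent),
          let cgImage = CIContext().createCGImage(output, from: output.extent)
    else { return nil }

    return UIImage(cgImage: cgImage)
}
```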

Messages uses on-device machine learning to analyze image attachments and determine if a photo appears to contain nudity. The feature is designed so that Apple does not have access to the photos.
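As a rough sketch of what an on-device check of this kind can look like, the snippet below runs a Core ML image classifier through the Vision framework entirely on the device. The `NudityClassifier` model, the "explicit" label and the confidence threshold are hypothetical placeholders; Apple's actual classifier is private to the operating system.

```swift
import Vision
import CoreML
import UIKit

/// Minimal sketch of an on-device sensitivity check. "NudityClassifier" is a
/// hypothetical Core ML model bundled with the app, not Apple's private classifier.
func imageAppearsSensitive(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: NudityClassifier(configuration: MLModelConfiguration()).model)
    else {
        completion(false)
        return
    }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Flag the image when the top label is "explicit" with high confidence;
        // both the label name and the 0.8 cutoff are assumptions for this example.
        if let top = (request.results as? [VNClassificationObservation])?.first,
           top.identifier == "explicit", top.confidence > 0.8 {
            completion(true)
        } else {
            completion(false)
        }
    }

    // All analysis happens locally; no image data leaves the device.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```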


Apple offers a series of options to minors who receive explicit images

The app also lets the user choose between messaging a trusted adult contact to alert them that they have received a sensitive image, or blocking the sender. It additionally displays a shortcut to "other ways to find help", which redirects the user to a website with resources and advice on how to avoid continuing the conversation.

If the user ultimately decides to view the photo, the Messages app shows another warning about the kind of content the image may contain and again offers the option of ignoring it or contacting an adult.

Apple has also started to activate an additional child protection feature, this one aimed at searches made through Spotlight, Siri and Safari, according to The Verge. When a user searches any of these three services for topics related to child sexual abuse, the results shown will include safety resources and guidance.

Meanwhile, the CSAM detection feature remains unavailable. That tool, as a reminder, would scan iPhone photos before they are uploaded to iCloud to detect child sexual abuse material. Apple ended up delaying its launch after widespread concerns from users and from security and privacy experts.
