Apple to release revised iMessage nudity-detection tool for children in beta



Apple proposed tools to protect children in the summer, but has since delayed their release.


James Martin/CNET

Apple will begin beta testing a feature for its iMessage text messaging app designed to protect children from sending or receiving nude images, the company said Tuesday. The new feature, which Apple adjusted after receiving feedback from critics, is part of a series of new capabilities designed to fight child exploitation.

Apple’s new messages feature will analyze an attachment in a text message or iMessage sent to users marked as children to determine whether it contains nudity. Apple said it will maintain message encryption as part of the process, and the system is designed so that no indication of the detection or of the nudity leaves the device.

The tech giant also said that it’s changed how the system works. Initially, Apple intended to alert the parents of children under the age of 13 if the child viewed or sent the image anyway. Now, children can choose whether to alert someone they trust, and that choice is separate from whether they view the image.


Apple’s new system for child safety in iMessage.


Apple

Apple’s move marks the latest in its efforts to build child protection tools into its devices. Earlier this year, Apple announced plans to build both the iMessage feature and a system to detect child exploitation imagery stored on some Apple devices. Apple said it built technology that would keep its servers from scanning the images, as many other companies, including Facebook, Microsoft and Twitter, do today. Instead, images would be scanned on the phone itself.

The tech giant’s message detection feature is separate from its plan to scan for child exploitation imagery, which the company said in September it would delay.

As part of its messages feature, Apple said it’s expanding the guidance its Siri voice assistant provides when children or parents ask about problematic issues. That includes information about how and where to file reports of child exploitation.


