Apple to Scan iPhones for Child Sex Abuse Images and Report to Law Enforcement

Tech giant Apple is adding a series of new child-safety features to its next major operating system updates for iPhone and iPad. As part of the iOS 15 and iPadOS 15 updates later this year, the company will implement a feature to detect photos stored in iCloud Photos that depict sexually explicit activity involving children.

“This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC),” the company said in a notice on its website. NCMEC acts as a reporting center for child sexual abuse material (CSAM) and works in collaboration with law enforcement agencies across the U.S.

According to Apple, its method of detecting known CSAM is “designed with user privacy in mind.” The company says it does not directly access customers’ photos; instead, it uses a device-local, hash-based matching system to detect known child abuse images. Apple says it cannot see users’ photos or the results of the scans unless a photo matches a known CSAM hash.
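Apple has not published implementation details in this notice, and its actual system reportedly relies on perceptual hashing (NeuralHash) plus cryptographic protections rather than simple digests. Purely to illustrate the general idea of on-device matching against a database of known hashes, here is a minimal sketch using ordinary SHA-256 digests as a stand-in; the hash set and image bytes are hypothetical:

```python
import hashlib

# Hypothetical database of hashes of known flagged images. In Apple's
# real system this would be a set of NeuralHash perceptual hashes
# supplied by NCMEC, not SHA-256 digests.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-flagged-image-bytes").hexdigest(),
}

def matches_known_hash(photo_bytes: bytes, known_hashes: set[str]) -> bool:
    """Return True if the photo's hash appears in the known-hash set.

    The comparison runs entirely on the device; only the match result,
    never the photo itself, would be reported upstream.
    """
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in known_hashes

# A photo whose bytes hash to a known entry is flagged;
# any other photo is not.
flagged = matches_known_hash(b"example-known-flagged-image-bytes", KNOWN_HASHES)
clean = matches_known_hash(b"ordinary-vacation-photo-bytes", KNOWN_HASHES)
```

Note that an exact-digest scheme like this would miss resized or re-encoded copies of an image, which is why real systems use perceptual hashes that tolerate such transformations.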

In addition, with the iOS 15 update, the Messages app will add new tools to warn children and their parents when sexually explicit photos are received or sent.
