The following report is from CNBC:

Apple will report images of child exploitation uploaded to iCloud in the U.S. to law enforcement, the company said on Thursday.

The new system will detect images known as Child Sexual Abuse Material (CSAM) using a process called hashing, where each image is transformed into a unique number that corresponds to that image.
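For readers unfamiliar with the term, a minimal, hypothetical sketch of hash matching is below. Apple's actual system uses a perceptual hash it calls NeuralHash, not a cryptographic one; the SHA-256 digest, file handling, and example hash value here are stand-ins invented purely to illustrate reducing an image to a number and checking it against a list of known numbers.

import hashlib

# Hypothetical sketch only. Apple's real system uses a perceptual hash
# (NeuralHash) that tolerates resizing and re-encoding; SHA-256 is a
# stand-in here purely to show the idea of reducing an image to a fixed
# number and comparing it against known numbers.

def image_hash(path: str) -> str:
    """Return a hex digest derived from the image file's bytes."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Invented example: a set of hashes of already-known illegal images.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_match(path: str) -> bool:
    return image_hash(path) in known_hashes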

Apple started testing the system on Thursday, but most U.S. iPhone users won’t be part of it until an iOS 15 update later this year, Apple said.

The move brings Apple in line with other cloud services which already scan user files, often using hashing systems, for content that violates their terms of service, including child exploitation images.

It also represents a test for Apple, which says that its system is more private for users than previous approaches to eliminating illegal images of child sexual abuse, because it uses sophisticated cryptography on both Apple’s servers and user devices and doesn’t scan actual images, only hashes.

But many privacy-sensitive users still recoil from software that notifies governments about the contents on a device or in the cloud, and may react negatively to this announcement, especially since Apple has vociferously defended device encryption and operates in countries with fewer speech protections than the U.S.

Law enforcement officials around the world have also pressured Apple to weaken its encryption for iMessage and other software services like iCloud to investigate child exploitation or terrorism. Thursday’s announcement is a way for Apple to address some of those issues without giving up some of its engineering principles around user privacy.

How it works

Before an image is stored in Apple’s iCloud, Apple matches the image’s hash against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC). That database will be distributed in the code of iOS beginning with an update to iOS 15. The matching process is done on the user’s iPhone, not in the cloud, Apple said.
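A simplified, hypothetical picture of that on-device, pre-upload check might look like the sketch below; all function and variable names are invented for illustration. In Apple's described design the database shipped with iOS is blinded and the match result is sealed inside an encrypted voucher, so neither the phone's owner nor Apple learns the outcome for any single photo at this stage.

import hashlib

# Hypothetical sketch of the on-device, pre-upload check described above.
# The plain set lookup stands in for the blinded, encrypted matching that
# the real protocol performs.

def file_hash(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def prepare_icloud_upload(photo_path: str, ncmec_hashes: set) -> dict:
    """Run the match locally on the phone, then bundle the (normally
    encrypted) result with the photo about to be uploaded."""
    return {
        "photo": photo_path,
        "matched": file_hash(photo_path) in ncmec_hashes,
    }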

If Apple then detects a certain number of violating files in an iCloud account, the system will upload a file that allows Apple to decrypt and see the images on that account. A person will manually review the images to confirm whether or not there’s a match.
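The threshold step can be modeled loosely as follows. This toy sketch mimics only the control flow: in the actual design the gate is enforced cryptographically (threshold secret sharing), so Apple's servers cannot decrypt anything for an account that has not crossed the threshold, and the THRESHOLD value here is an assumption rather than a figure stated in this report.

# Toy model of the threshold-then-human-review step, building on the
# voucher dictionaries from the previous sketch. THRESHOLD is assumed.

THRESHOLD = 30

def reviewable_images(vouchers: list) -> list:
    """Return matching uploads only once the account crosses the threshold."""
    matches = [v for v in vouchers if v["matched"]]
    if len(matches) <= THRESHOLD:
        return []       # below the threshold: nothing is decryptable
    return matches      # above it: flagged images go to a human reviewer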

Apple will only be able to review images that match content that’s already known and reported to these databases — it won’t be able to detect parents’ photos of their kids in the bath, for example, as these images won’t be part of the NCMEC database.

If the person doing the manual review concludes the system did not make an error, then Apple will disable the user’s iCloud account, and send a report to NCMEC or notify law enforcement if necessary. Users can file an appeal to Apple if they think their account was flagged by mistake, an Apple representative said.

The system only works on images uploaded to iCloud, which users can turn off, Apple said. Photos or other images on a device that haven’t been uploaded to Apple servers won’t be part of the system.

Some security researchers have raised concerns that this technology could eventually be used to identify other kinds of images, such as photos of a political protest. Apple said that its system is built so that it works, and can only work, with images cataloged by NCMEC or other child safety organizations, and that the way it built the cryptography prevents it from being used for other purposes.

Apple can’t add additional hashes to the database, it said. Apple said that it is presenting its system to cryptography experts to certify that it can detect illegal child exploitation images without compromising user privacy.

Apple unveiled the feature on Thursday alongside other features intended to protect children from predators. In a separate feature, Apple will use machine learning on a child’s iPhone with a family account to blur images that may contain nudity, and parents can choose to be alerted when a child under 13 receives sexual content in iMessage. Apple also updated Siri with information about how to report child exploitation.


AUTHOR COMMENTARY

Patrick Wood, editor of Technocracy News & Trends, said this in regard to the announcement:

Apple is now taking on the role of policeman as it starts scanning all of your photos uploaded to its iCloud service. The first wave of extraction is photos representing child exploitation, and they will report you to the police! Their Technocrat AI algorithms will not be made public. You can turn off iCloud on your Apple devices, but you will lose the ability to sync them.

The excuse is to trap child abusers and sexual predators, but in the process 100% of your photos must be scanned as well. Once the surveillance door is opened, any other type of photo can be targeted for any other purpose. This could include political protests, location tracking information, psychological profiles, i.e., what you are taking pictures of, etc.

I totally agree with his assessment.

More draconian overreach, and loss of freedom and privacy, comes masquerading as an effort to stop evil.

For the transgression of a land many are the princes thereof: but by a man of understanding and knowledge the state thereof shall be prolonged.

Proverbs 28:2

“Coincidentally,” Adobe, Twitter, and the New York Times recently announced a partnership to censor info and data in photo metadata. Welcome to 2021: the press tells you what is allowed while giving you so-called honest journalism.


