- By Henry Faulkner Brittany
- June 27, 2022
Apple New Feature: Apple aims to roll out a photo-checking system that screens photos for child abuse imagery before they are uploaded from iPhones in the United States to its iCloud storage, and plans to expand the system on a country-by-country basis.
On Friday, Apple announced that the rollout of the photo-checking system will depend on local laws in each country. A day earlier, Apple said the system would scan images on the device before they are uploaded, so that child abuse imagery never reaches iCloud.
Child safety groups have praised Apple for taking these measures, joining Facebook, Microsoft, and Alphabet’s Google, which already screen for such material.
But because the new photo-checking system operates on the device, before images are uploaded to Apple’s servers, it has raised concerns that governments could exploit such scanning to probe users’ devices.
In a media briefing, Apple said it plans to extend the service country by country, in accordance with the laws of each country where it operates.
The company also introduced a feature called ‘safety vouchers,’ designed to prevent governments from pressuring it to probe for material other than child abuse imagery. Safety vouchers passed from an iPhone to Apple’s servers contain no useful data on their own, which protects users’ data.
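The idea behind such vouchers can be sketched roughly as follows. This is an illustrative toy model only, not Apple’s actual implementation: the hash function, the threshold value, and all names here are assumptions (real systems use perceptual hashing of known imagery, not SHA-256).

```python
import hashlib

# Illustrative sketch only -- NOT Apple's actual system.
# Assumed: a set of hashes of known child-abuse images, and a
# threshold of matches before any voucher becomes meaningful.
KNOWN_BAD_HASHES = {hashlib.sha256(b"known-bad-image").hexdigest()}
THRESHOLD = 3  # hypothetical number of matches required

def make_voucher(image_bytes: bytes) -> dict:
    """Attach a match flag to each upload; one flag alone reveals nothing."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return {"matched": digest in KNOWN_BAD_HASHES}

def review_needed(vouchers: list) -> bool:
    """Human review is triggered only once matches cross the threshold."""
    return sum(v["matched"] for v in vouchers) >= THRESHOLD
```

In this sketch, a single voucher carries only a yes/no flag, and nothing happens until enough flags accumulate, which mirrors the article’s point that vouchers by themselves contain no useful data.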
In addition, Apple operates a human review process intended to stop governments from using the system to identify anything other than child abuse material. Law enforcement will receive no data from the photo-checking system unless child abuse material is actually found.