Apple: New child safety feature will only flag images marked in multiple countries

Apple recently announced a new child safety feature that scans users' iCloud Photos for images of child sexual abuse so that matches can be reported to the authorities. The decision was heavily criticized by privacy advocates.

Now, in response to that criticism, Reuters reports that the tech giant has clarified that the feature will only flag images that have been identified by child safety clearinghouses in multiple countries.

The automated scanning system will only alert Apple for human review once an account exceeds an initial threshold of 30 matched images. The company said this threshold may be lowered in the future. It also made clear that its list of image identifiers is universal and identical on every device to which it is applied.
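To illustrate the threshold mechanism described above, here is a minimal sketch in Swift. The type and property names (MatchThresholdPolicy, threshold, matchedImageCount) are hypothetical and are not taken from Apple's actual implementation; only the initial value of 30 comes from the report.

```swift
// Minimal sketch of a match-count threshold check (hypothetical names,
// not Apple's actual code). An account is only surfaced for human review
// once its number of matched images exceeds the threshold.
struct MatchThresholdPolicy {
    // Initial threshold reported by Apple; the company said it may be lowered later.
    var threshold: Int = 30

    func shouldEscalateForReview(matchedImageCount: Int) -> Bool {
        return matchedImageCount > threshold
    }
}

let policy = MatchThresholdPolicy()
print(policy.shouldEscalateForReview(matchedImageCount: 12))  // false
print(policy.shouldEscalateForReview(matchedImageCount: 31))  // true
```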

Apple further stated that the encrypted on-device database of child sexual abuse material (CSAM) hashes will only contain hashes supplied by at least two organizations operating under the jurisdiction of different national governments.
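The following Swift sketch shows one way the "at least two jurisdictions" rule described above could work. The types, organization names, and sample hashes are hypothetical assumptions for illustration, not Apple's actual pipeline.

```swift
// Sketch of the "at least two jurisdictions" rule (hypothetical types and data).
// A hash only enters the on-device database if it was submitted by organizations
// operating under at least two different national governments.
typealias ImageHash = String

struct HashSubmission {
    let organization: String
    let jurisdiction: String   // the national government the organization operates under
    let hashes: Set<ImageHash>
}

func buildOnDeviceDatabase(from submissions: [HashSubmission]) -> Set<ImageHash> {
    // Collect, for every hash, the set of distinct jurisdictions that submitted it.
    var jurisdictionsByHash: [ImageHash: Set<String>] = [:]
    for submission in submissions {
        for hash in submission.hashes {
            jurisdictionsByHash[hash, default: []].insert(submission.jurisdiction)
        }
    }
    // Keep only hashes vouched for by organizations in two or more jurisdictions.
    return Set(jurisdictionsByHash.filter { $0.value.count >= 2 }.keys)
}

let submissions = [
    HashSubmission(organization: "OrgA", jurisdiction: "US", hashes: ["h1", "h2"]),
    HashSubmission(organization: "OrgB", jurisdiction: "UK", hashes: ["h2", "h3"]),
]
print(buildOnDeviceDatabase(from: submissions))  // ["h2"]
```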

The tech giant did not say whether the criticism had changed its position, but it acknowledged that its initial announcement had caused confusion and said the project is still in development.
