Apple's plan to scan iCloud photos for child sexual abuse material draws opposition from inside and outside the company

Apple previously announced that it will scan photos uploaded by users to iCloud for child sexual abuse material. Some Apple employees are worried about the new policy and have voiced their concerns on Apple's internal Slack channels. Some believe governments could exploit the system to search for other content for censorship or arrests, while others worry that Apple's long-standing reputation for privacy protection will be damaged as a result.

Others point out that the U.S. government cannot legally scan large numbers of personal devices, nor compel others to do so, so Apple's decision to scan images voluntarily could have serious consequences. Insiders said a coalition of policy-focused groups plans to send a protest letter to Apple within days, asking the company to suspend the plan. In the past 24 hours, the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT) have also expressed opposition.

Apple has said it will not allow any government to use the system to check phones for content other than child sexual abuse material. According to Apple, scanning will initially be carried out only in the United States, with other countries added one by one, and only images uploaded to iCloud will be scanned.

Scanning will only match images already confirmed by the National Center for Missing and Exploited Children (NCMEC) and a few other agencies. However, critics believe that some governments and courts may demand that the scope of scanning be expanded, and because those markets are so important to Apple, the company may find it difficult to refuse.
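To make the matching step concrete, the sketch below shows the general idea in heavily simplified form: an uploaded image is reduced to a hash and checked against a set of hashes for known, agency-confirmed images. This is only an illustration under my own assumptions; Apple's actual system uses a perceptual hash (NeuralHash) combined with cryptographic techniques rather than a plain SHA-256 lookup, and the names `knownHashes` and `matchesKnownImage` are hypothetical.

```swift
import Foundation
import CryptoKit  // Apple platforms; on Linux, swift-crypto's "Crypto" module provides SHA256

// Hypothetical sketch of hash-database matching. SHA-256 is a stand-in here;
// the real system uses a perceptual hash so that resized or re-encoded copies
// of a known image still match.

/// Hashes of images already confirmed by the relevant agency (placeholder value).
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Returns true if the image's hash appears in the known-image database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: check a photo before it would (hypothetically) be flagged on upload.
let photo = Data([0x01, 0x02, 0x03])  // placeholder image bytes
print(matchesKnownImage(photo) ? "match" : "no match")
```

The critics' argument follows directly from this structure: whoever controls the contents of the hash database controls what the system flags.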

Police or other agencies in some countries may also invoke local laws to require Apple to provide technical support for criminal investigations, thereby forcing it to expand the system's coverage. From a technical standpoint, broader surveillance is feasible because the infrastructure already supports it; the new policy itself demonstrates this, and Apple cannot plausibly deny it.
