Apple postpones its CSAM features; some are calling for their complete removal

In recent days, Apple announced its decision to take more time before rolling out new safety features aimed at combating the spread of child sexual abuse material (CSAM). The decision was welcomed by the civil rights groups that had formally urged Apple to abandon the feature, fearing significant repercussions for user privacy.

The Electronic Frontier Foundation, in particular, said it was satisfied with the decision, though it would like Apple to go further and drop the plan entirely. In a press release, the EFF said it was pleased that Apple is now listening to users’ concerns, but that the company must go beyond just listening and completely abandon its plans to insert a backdoor into its encryption.

Apple, for its part, has consistently defended the features, arguing that they are far more privacy-preserving than the technologies used by other companies. The original plan was to roll out the CSAM features later this year as part of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. There is currently no indication of when the company now intends to implement them.

The EFF, which has so far collected over 25,000 signatures against the plan, also reiterated the criticism voiced by other civil rights groups: that the features would not improve but rather reduce privacy for all iCloud Photos users, and that they could be exploited by authoritarian governments to track dissidents, protesters, or refugees.

In light of all this criticism, it is not yet known how Apple intends to improve these features or when it plans to implement them. Abandonment seems unlikely, however, given that the Cupertino company considers them fundamentally important.
