iOS 15.2 beta introduces App Privacy Report, surfaces code for children's Communication Safety feature

With Wednesday's release of the iOS 15.2 beta, Apple introduced the App Privacy Report, and code in the beta also references the company's upcoming child safety feature. First detailed at the Worldwide Developers Conference (WWDC 2021) in June, the App Privacy Report shows how individual apps access user data and device sensors. It is the latest in a series of tools Apple has introduced to bring new levels of transparency to its software and hardware.

The App Privacy Report lives in the Settings -> Privacy section and lets users see how often each app has accessed their location, photos, camera, microphone, and contacts over the past seven days.

Apps in the report are sorted by most recent activity. The report also tracks recent network activity, revealing the domains an app has contacted either directly or through content loaded in a web view, so users can trace in detail what has happened over the past week.
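The report itself is generated by the system, but to give a rough sense of the kind of data involved, here is a minimal sketch that aggregates hypothetical access records over a seven-day window. The record shape, field names, and `summarize` helper are assumptions for illustration, not Apple's actual schema or API.

```swift
import Foundation

// Hypothetical, simplified shape of one data/sensor access record.
// Field names are assumptions for illustration, not Apple's schema.
struct AccessRecord: Codable {
    let bundleID: String   // e.g. "com.example.app"
    let category: String   // e.g. "location", "photos", "camera", "microphone", "contacts"
    let timestamp: Date
}

// Decode newline-delimited JSON, one record per line.
func decodeRecords(from ndjson: String) -> [AccessRecord] {
    let decoder = JSONDecoder()
    decoder.dateDecodingStrategy = .iso8601
    return ndjson.split(separator: "\n").compactMap { line in
        try? decoder.decode(AccessRecord.self, from: Data(line.utf8))
    }
}

// Count accesses per app and per category over the report's seven-day window,
// mirroring how the report surfaces per-app counts for each data type.
func summarize(_ records: [AccessRecord], now: Date = Date()) -> [String: [String: Int]] {
    let windowStart = now.addingTimeInterval(-7 * 24 * 60 * 60)
    var counts: [String: [String: Int]] = [:]
    for record in records where record.timestamp >= windowStart {
        counts[record.bundleID, default: [:]][record.category, default: 0] += 1
    }
    return counts
}
```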

Beyond the App Privacy Report interface, the iOS 15.2 beta also contains code describing the Communication Safety feature in detail. As MacRumors points out, the feature is built into the Messages app and uses on-device machine learning to automatically obscure images judged to be sexually explicit, protecting children from such material.

When such content has been viewed, the system can send a message alerting the guardians of children under 13, while users aged 13 to 17 are shown a different set of wordings.
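Apple has not published the implementation, but the behavior described by the strings in the beta corresponds roughly to a flow like the sketch below. The `isSensitive(_:)` classifier, the blur step, and the warning text are all hypothetical placeholders, not Apple's actual API or wording.

```swift
import UIKit
import CoreImage

// Hypothetical stand-in for the on-device classifier; Apple's actual model
// and its API are not public.
func isSensitive(_ image: UIImage) -> Bool {
    // ... an on-device image classifier would run here ...
    return false
}

enum AgeGroup {
    case underThirteen
    case thirteenToSeventeen
}

// Obscure a flagged image before it is shown in the conversation.
func blurred(_ image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CIGaussianBlur") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(30.0, forKey: kCIInputRadiusKey)
    guard let output = filter.outputImage else { return nil }
    return UIImage(ciImage: output)
}

// General decision flow: hide the image and pick an age-appropriate warning.
// The warning text here is placeholder wording, not Apple's actual strings.
func handleIncoming(_ image: UIImage, for ageGroup: AgeGroup) -> (display: UIImage?, warning: String?) {
    guard isSensitive(image) else { return (image, nil) }
    switch ageGroup {
    case .underThirteen:
        // For under-13 accounts the beta's strings also reference notifying guardians.
        return (blurred(image), "This may be sensitive. If you view it, your parents can be notified.")
    case .thirteenToSeventeen:
        return (blurred(image), "This may be sensitive to view. Are you sure?")
    }
}
```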


The warning messages shared in the report include the following:

  • Nude photos and videos can be used to hurt people, and once the content is shared, it cannot be taken back.
  • It is not your fault, but sensitive photos and videos can be used to hurt you.
  • Even if you trust the recipient now, they can still share the content with others without your permission, and it may never go away.
  • Whoever receives the content can share it with anyone; it may remain out in the world forever, and sharing it may itself be illegal.

Background Information

  • Communication Safety is part of a broader effort to limit the spread of child sexual abuse material (CSAM) across Apple's major platforms.
  • As announced in August, Apple planned a three-pronged approach covering Messages, Siri and Search, and photos stored in iCloud.
  • The iCloud component works by hashing photos as they are uploaded to iCloud and matching them against a database of known CSAM hashes (see the sketch after this list).
  • After strong opposition from industry experts and privacy advocates, Apple postponed the rollout originally planned for September.
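Apple's actual system uses a perceptual hash ("NeuralHash") computed on-device and a private set intersection protocol, neither of which is public. Purely to illustrate the matching idea described above, here is a minimal sketch in which the hash function, the database loader, and the threshold-free flagging check are all hypothetical simplifications.

```swift
import Foundation

// Hypothetical placeholder for Apple's perceptual "NeuralHash"; the real
// algorithm is designed so that visually similar images hash to the same value.
func neuralHash(of photoData: Data) -> String {
    // ... perceptual hashing would happen here ...
    return ""
}

// Stand-in for the on-device copy of the known-CSAM hash database, which in
// Apple's design is supplied by child-safety organizations and stored blinded.
func loadKnownHashes() -> Set<String> {
    return []
}

let knownCSAMHashes = loadKnownHashes()

// Tag a photo at iCloud upload time if its hash matches the known set.
// Apple's design only acts after a threshold number of matches and wraps the
// comparison in cryptographic protocols; that machinery is omitted here.
func shouldFlag(_ photoData: Data) -> Bool {
    knownCSAMHashes.contains(neuralHash(of: photoData))
}
```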
