The new child safety features cover three areas: communication safety in Messages, CSAM detection, and expanded guidance in Siri and Search.

"This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time," Apple said.

In Messages, the new communication safety feature will warn children when they receive sexually explicit photos, and it will also enable parents to play a more informed role in helping their children navigate communication online.

Similar protections are available if a child attempts to send sexually explicit photos.

CSAM Detection

CSAM refers to content that depicts sexually explicit activities involving a child.

iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online.

Further, Apple's method of detecting known CSAM is designed with user privacy in mind.

Instead of scanning images in the cloud, the system performs on-device matching against a database of known CSAM image hashes. Apple transforms this database into an unreadable set of hashes that is securely stored on users' devices.

Even when a match is flagged, Apple learns only about images that match known CSAM.

Siri and Search will also intervene when users attempt to search for CSAM-related topics, pointing them to resources for getting help.

source: www.techworm.net