Apple Announces It Will Scan User Photos for Child Pornography


Apple has announced steps to limit the spread of Child Sexual Abuse Material (CSAM), including the scanning of photos its users store in iCloud Photos.
-----
Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
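For readers curious what matching photos against a database of known image hashes can look like in the abstract, the Swift sketch below illustrates the general idea only. It is not Apple's implementation: Apple describes its system as using a perceptual NeuralHash and a private set intersection protocol, whereas this hypothetical placeholder uses an ordinary SHA-256 digest and a plain in-memory set of hashes.

import Foundation
import CryptoKit

// Deliberately simplified, hypothetical sketch: a cryptographic SHA-256 digest
// and an in-memory set stand in for Apple's perceptual NeuralHash and its
// private set intersection protocol, neither of which is reproduced here.

/// Hypothetical database of hashes of known images (empty placeholder).
let knownImageHashes: Set<String> = []

/// Computes a hex-encoded SHA-256 digest of a photo's raw bytes.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if the photo's digest appears in the known-hash database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownImageHashes.contains(digest(of: imageData))
}

// Example: check a placeholder photo before upload.
let photo = Data("placeholder image bytes".utf8)
print(matchesKnownImage(photo))   // false with the empty placeholder database

A key difference in the design Apple describes is that matching is split between the device and the server so that neither the device nor Apple learns about individual matches; results are only revealed once an account crosses a threshold number of matches, a property this plain set lookup does not capture.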
