Data Privacy,
Technology

Sep. 2, 2021

How much is too much to protect your children against sexual predators?

Svetlana McManus

Associate, Lewis Brisbois Bisgaard & Smith LLP

Kamran Salour

Partner, Troutman Pepper Hamilton Sanders LLP

Email: kamran.salour@troutman.com

Kamran is a partner in the Consumer Financial Services practice and is a member of the Cybersecurity, Information Governance and Privacy group at Troutman Pepper.

Apple's staunch refusal to create special software to ensure that the iPhone used by the San Bernardino terrorist would not auto-delete its contents headlined much of early 2016. And like most of the 2016 headlines, Apple's adamant refusal to assist law enforcement by creating a "backdoor" and decrypting that information has quickly been forgotten.

Therefore, when Apple announced last month that the upcoming release of iOS 15 would include two new safety features -- Communication Safety and CSAM (Child Sexual Abuse Material) Detection -- each aimed at combating child sexual predators, the backlash paralleled the arguments highlighted in 2016. This time, however, the opposition to Apple echoes the sentiments of Apple's allies from five years earlier.

Is the opposition based on a misunderstanding of how the features will operate? Or is it a genuine response to a legitimate fear that these new technological features foreshadow Apple's encroachment into a once-impenetrable realm of privacy?

Before analyzing the merits of the opposition, it is important to understand the capabilities of Apple's Communication Safety and CSAM Detection features embedded in the iOS 15 update.

Communication Safety

Communication Safety is intended to give parents and children additional tools to protect children from sending and receiving sexually explicit images in Messages on an Apple device. The feature permits parents or guardians to enable parental notifications for accounts belonging to children 12 and under. Once enabled, if a child 12 or under sends or receives a sexually explicit image through Messages, the child receives an automated warning. The warning gives the child the choice whether to view the image and cautions that, if the child continues to view or send the sexually explicit image, the child's parents will receive a notification alerting them to the activity.

The Communication Safety feature analyzes only images sent through Messages; it does not analyze the contents of any messages. Furthermore, Apple never gains access to the messages, the image evaluations, or the notifications sent to children or their parents.
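To make that flow concrete, the short Swift sketch below models the notification logic described above. It is purely illustrative and not Apple's implementation: the account type, the explicit-image flag, and the function names are assumptions; only the age-12 threshold and the warn-then-notify sequence come from the feature description.

import Foundation

// Illustrative sketch only -- not Apple's implementation. The types and
// names below are assumptions used to model the flow described above.

struct ChildAccount {
    let age: Int
    let parentalNotificationsEnabled: Bool
}

enum ChildChoice {
    case declinedToView
    case viewedOrSentAnyway
}

// Decide whether a parent notification should follow after Messages flags
// an image on a child's account as sexually explicit.
func shouldNotifyParent(account: ChildAccount,
                        imageFlaggedExplicit: Bool,
                        childChoice: ChildChoice) -> Bool {
    // Notifications apply only to enabled accounts of children 12 and under.
    guard imageFlaggedExplicit,
          account.age <= 12,
          account.parentalNotificationsEnabled else {
        return false
    }
    // The child is warned first; parents are notified only if the child
    // chooses to view or send the image anyway.
    return childChoice == .viewedOrSentAnyway
}

// Example: an 11-year-old with notifications enabled views a flagged image.
let account = ChildAccount(age: 11, parentalNotificationsEnabled: true)
print(shouldNotifyParent(account: account,
                         imageFlaggedExplicit: true,
                         childChoice: .viewedOrSentAnyway)) // prints "true"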

CSAM Detection

The CSAM Detection feature is intended to identify when a user of an iOS device attempts to store known CSAM images in iCloud. Organizations such as the National Center for Missing and Exploited Children maintain databases of known CSAM. Through CSAM Detection, Apple relies on cryptography to compare unreadable hashes of images stored on a user's device against hashes of known CSAM images from such databases. Apple is notified only if the user attempts to store known CSAM images in iCloud, and Apple estimates that the probability of incorrectly flagging an account is less than 1 in 1 trillion per year.

The CSAM Detection feature affects only users who store CSAM photos in iCloud. It does not apply to a user's private iPhone photo library that remains solely on the device; CSAM Detection does not scan photos stored only on a user's Apple device.
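As a rough illustration of hash-based matching, the Swift sketch below checks an image's hash against a set of known hashes. It is a deliberate simplification: Apple's system relies on a perceptual hash and cryptographic private set intersection rather than the plain SHA-256 lookup shown here, and the database contents, names, and usage are hypothetical.

import Foundation
import CryptoKit

// Simplified illustration only. Apple's CSAM Detection does not work this
// way in detail; this substitutes a plain SHA-256 lookup to convey the
// general idea of comparing an image against a database of known hashes
// before it reaches iCloud.

// Hypothetical database of known-image hashes (hex-encoded), as might be
// derived from a source such as NCMEC's database.
let knownImageHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

// Hex-encode a SHA-256 digest of the image bytes.
func hashHex(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// True if the image's hash matches a known entry.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownImageHashes.contains(hashHex(of: imageData))
}

// Example usage: the bytes of "test" hash to the entry above, so this
// prints "match found"; any other bytes print "no match".
let photo = Data("test".utf8)
print(matchesKnownImage(photo) ? "match found" : "no match")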

Opposition to Apple's Communication Safety and CSAM Detection Features

Both government and industry have acknowledged the need to balance an individual's right to privacy against society's interest in giving law enforcement the tools required to maintain safety. At first glance, Apple's new safety features appear to strike that balance, protecting user privacy while mitigating the spread of sexually explicit images to children and the propagation of CSAM. Moreover, Apple affords users a choice: they can opt out of the Communication Safety feature, and they need only refrain from storing known CSAM images in iCloud.

Nevertheless, for some individuals, the notion that Apple will scan the contents of their iOS device -- regardless of the actual data stored -- is by itself an invasion of privacy.

For others, however, the opposition has less to do with these new safety features and more to do with the unanswered question: what comes next? Just five years ago, Apple was unwilling to develop special code to provide the FBI with a backdoor to a single Apple device, even though doing so would have assisted the FBI's fight against terrorism. Apple refused because doing so would threaten data security, resulting in an imbalance between privacy rights and society's interest.

Now, unlike in the backdoor debate of 2016, Apple believes that these new safety features properly balance user privacy and society's interest in stopping the spread of sexually explicit images to minors and the storage of CSAM. Some genuinely fear that Apple's decision to allow on-device scanning will have unintended consequences, such as Apple broadening the scope of on-device scanning to share user data with law enforcement without valid warrants -- or worse, an unauthorized actor manipulating Apple's legitimate on-device scanning for malicious purposes.

While Apple has refused to create a backdoor, it has voluntarily opened the door to on-device scanning, notwithstanding the resulting potential privacy concerns. What else then is Apple willing to do?
