Apple’s proposed new child safety features and the problems with privacy tradeoffs

Lumen Database Team
Sep 8, 2021
“Apple Rally 2.23.16” by Electronic_Frontier_Foundation is licensed under CC BY-NC 2.0

On September 3, 2021, Apple paused the implementation of two highly controversial new child safety features that would both algorithmically and manually surveil the devices of Apple users for Child Sexual Abuse Material (CSAM).

The first of these opt-in features would blur sexually explicit images sent or received through iMessage by users under 18 and would also alert the parents of children under 12 if such an image was viewed. Devices owned by users under 18 would scan incoming and outgoing messages through an on-device image classifier trained to detect ‘pornography’. The second feature would use AI to scan on-device images as they are backed up to iCloud, searching for matches against a database of known CSAM images maintained by child safety organizations such as the National Center for Missing & Exploited Children (NCMEC). If enough matches were found, a manual review would be carried out before the user was flagged to law enforcement agencies. While Apple was already scanning iCloud emails for CSAM, images stored on a user’s device had not been scanned before.
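At a high level, that second feature amounts to comparing fingerprints of a user’s photos against a list of known-bad fingerprints and escalating to human review only once enough matches accumulate. The sketch below is a simplification under that assumption: the function names, the plain set lookup, and the threshold value are invented for illustration, and Apple’s published design instead pairs its NeuralHash perceptual hash with cryptographic private set intersection, so the server never handles a readable list like this.

# Illustrative sketch only: the real system uses Apple's NeuralHash perceptual
# hash and cryptographic private set intersection, not the readable set lookup
# shown here. The function names and the threshold value are invented.

from typing import Iterable, Set

MATCH_THRESHOLD = 30  # hypothetical number of matches required before human review


def count_known_matches(photo_hashes: Iterable[str],
                        known_csam_hashes: Set[str]) -> int:
    """Count how many of a user's photo hashes appear in the known-hash database."""
    return sum(1 for h in photo_hashes if h in known_csam_hashes)


def flag_for_human_review(photo_hashes: Iterable[str],
                          known_csam_hashes: Set[str]) -> bool:
    """Escalate to manual review only once the match count crosses the threshold."""
    return count_known_matches(photo_hashes, known_csam_hashes) >= MATCH_THRESHOLD

The key design point is that what counts as a match, and the contents of the hash database itself, are controlled by Apple and its partners rather than by the user, and that is where much of the criticism is aimed.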

Not long after this proposal was made public, Apple, long a proponent of privacy, received enormous backlash from civil society members, advocacy groups, and customers. The backlash centered on the privacy and security concerns raised by what critics saw as a backdoor into Apple’s encryption, and on the fear that it would set a dangerous precedent, allowing governments to demand that users’ devices be scanned for other kinds of material. As EFF noted, “The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.” EFF’s petition for a rollback of these features has received over twenty-five thousand signatures.

While privacy activists applauded the delay their resistance had won, the delay caused concern and dismay among child protection agencies, who saw the new CSAM detection feature as a significant step forward in keeping children safe from abuse online and one that could eventually become standard industry practice. Supporters of the new child safety features believed that scanning private messages was a ‘proportionate’ tradeoff for ensuring children’s safety online.

The underlying assumption that tradeoffs between privacy and other important aspects of life, such as public health and child safety, are acceptable is a common one, but it may also be a problematic one.

At the start of the COVID-19 pandemic, the tradeoff between privacy and the right to health seemed like an obvious one to many countries (and citizens) too. The notion that sacrificing user privacy through contact tracing and surveillance applications would curb the spread of COVID-19 led several nations to partner with private companies to build surveillance tools tracking the movement of their citizens. However, just a few months in, the effectiveness and efficiency of such applications were questioned and challenged. Personal data gathered through these platforms was also misused to carry out mass surveillance.

Similarly, in the aftermath of 9/11 in the US, a tradeoff between privacy and national security took place: various surveillance tools were implemented that dramatically compromised the privacy of US citizens, ostensibly to improve law enforcement’s ability to surveil and identify terrorist threats, and therefore improve public safety. However, the new agencies, laws, regulations, rules, and recommendations put in place for heightened national security were also (ab)used to enable substantial new surveillance of US citizens and to erode various privacy protections, while having relatively small demonstrable positive effects on safety.

While Apple maintains that it would refuse to cede to US government requests to add other categories of images to the database against which user images are matched, it has several times in the past made concessions to foreign governments in order to continue operating in those countries. For instance, it sells iPhones without FaceTime in Dubai, where encrypted phone calls are not allowed, and it removed thousands of applications from its App Store in China so that it could continue to make sales in the country. In 2020, Apple also dropped plans for encrypting iCloud backups after pressure from the FBI.

Considering the evidence that willingly (and knowingly) compromising user privacy in the name of public health or national security has not only failed to produce the promised outcomes but has also led to the predicted (ab)uses, both legal and extra-legal, there may be reason to re-evaluate whether making a similar tradeoff for children’s safety online will be an effective measure. Once such a feature becomes industry practice, “good intentions” will not be sufficient to ensure that the backdoor it creates is not put to overbroad uses as well.

In addition to the potential overbroad uses to which such a feature could be put, the efficacy of the underlying technology has also been challenged. Coders have already found ways to change the hash an image produces without changing how the image itself looks. Such a flaw could undo the scanning system’s only proposed benefit, identifying CSAM: it could potentially be used to alter entire libraries so that they become invisible to Apple’s scanning system, or to alter innocuous photos so that they falsely trigger it.
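To make that fragility concrete, here is a toy example using a generic ‘average hash’ computed over a tiny grid of grayscale values. It is not NeuralHash, and the 4x4 image and the perturbation below are invented, but it shows how a change too small to alter how a picture looks can still flip a bit in its fingerprint so that it no longer matches a database entry.

# Toy demonstration of perceptual-hash fragility. This "average hash" is a
# generic technique, not Apple's NeuralHash; the 4x4 "image" and the
# perturbation below are made up for illustration.

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

original = [
    [200, 198, 190, 60],
    [185, 180,  70, 60],
    [ 90,  80,  75, 60],
    [ 60,  60,  60, 55],
]

# Nudge a single mid-brightness pixel; visually the image is nearly identical.
perturbed = [row[:] for row in original]
perturbed[1][2] += 40  # 70 -> 110 crosses the brightness mean

print(average_hash(original))                             # 1110110000000000
print(average_hash(perturbed))                            # 1110111000000000
print(average_hash(original) == average_hash(perturbed))  # False: no longer a match

The reverse manipulation, crafting an innocuous image whose fingerprint collides with a database entry, has also been demonstrated against NeuralHash itself, which is why both missed detections and false alarms featured in the criticism.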

So far, it looks like Apple is enabling decryption and on-device tracking without any concrete proof that this will be a watertight means of ensuring CSAM images are not shared, while there is ample proof that such decryption and on-device tracking have a history of being misused.

In 2015, Tim Cook opened the Champions of Freedom event in Washington by noting, “We at Apple reject the idea that our customers should have to make tradeoffs between privacy and security. We can, and we must provide both in equal measure. We believe that people have a fundamental right to privacy. The American people demand it, the Constitution demands it, morality demands it.”

For now, Apple has said it will delay the project until it has collected input and feedback from customers, advocacy groups, researchers, and others who objected to its implementation. It remains to be seen whether Apple will live up to Cook’s statement, either by abandoning the proposed features entirely or by ensuring that they provide privacy and security in equal measure. If, instead, Apple chooses to roll out the features as currently proposed, it may set an industry-wide standard that presents a privacy tradeoff as a necessity for ensuring children’s safety online, despite the historical evidence stacked against that notion.

About the Author: Shreya is an Employee Fellow at the Berkman Klein Center, where she works on the Lumen Project. She is a passionate digital rights activist and uses her research and writing to raise awareness about how digital rights are human rights. She tweets at @shreyatewari96

