August 9, 2021 - 5:25pm

Not planning to store child porn or send nudes to 11-year-olds? Good. But you should still care about Apple’s new initiative.

Two new initiatives, in fact.

One is AI software to detect sexually explicit photographs in the Messages app. Parents will be able to activate it on a child’s iPhone in a Family Sharing account. When activated, it will ask the under-18 user if they really want to view or send the image. Under-13s will be warned that, if they do, their parent will be notified. So much for end-to-end encryption guaranteeing that nobody but the sender and recipient knows what is sent.

The other innovation will compare photographs uploaded to iCloud storage with a database of known Child Sexual Abuse Material (CSAM). Using a process called hashing, which converts each image into a numerical fingerprint, the new system can check for matches without decrypting photos. If enough images from one iPhone match CSAM images in the reference file, however, a human will check the flagged images. Obviously, doing this means Apple will decrypt the images in question.
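To give a rough sense of how this kind of matching works, here is a minimal sketch. It is an illustration only: Apple’s actual system uses a proprietary perceptual hash (NeuralHash), the simple “average hash” below is a stand-in, and the reference fingerprint is invented.

```python
# Illustrative sketch of hash-based image matching (not Apple's algorithm).
# An image is reduced to a short fingerprint; fingerprints are compared
# against a reference list without anyone looking at the pictures themselves.

from PIL import Image  # third-party: pip install pillow


def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint.

    The image is shrunk to 8x8 greyscale pixels; each bit records whether a
    pixel is brighter than the average, so similar images give similar bits.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def is_match(hash_a: int, hash_b: int, max_distance: int = 5) -> bool:
    """Treat two fingerprints as a match if they differ in only a few bits."""
    return bin(hash_a ^ hash_b).count("1") <= max_distance


# Hypothetical reference list of known-bad fingerprints (made-up values).
reference_hashes = {0x1F2E3D4C5B6A7988}

photo_hash = average_hash("photo.jpg")
if any(is_match(photo_hash, ref) for ref in reference_hashes):
    # In the real system, a human reviewer is only involved once a
    # threshold number of matches has accumulated for one account.
    print("match found - would be flagged for review")
else:
    print("no match")
```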

In fact, Apple can already access photos stored in iCloud, which means they can, if legally required to do so, hand the keys to law enforcement agencies. And they do, thousands of times every year. Apple sell themselves as a more privacy-friendly tech option, but several years ago they decided against offering iCloud backup encryption without keeping a spare set of keys for themselves.

Their new plan to introduce client-side scanning adds a back door that could be used for other purposes.

The Global Internet Forum to Counter Terrorism (GIFCT), for example, already has a database of hashes identifying terrorist material. Shared with member organisations including Twitter, Facebook, YouTube and Microsoft, the database enables a similar matching process for rapid or automated removal of content.

The GIFCT reference database has no external oversight and often removes content that should, by most free speech standards, be freely available. Examples cited by the Electronic Frontier Foundation (EFF) include a satirical post mocking the anti-LGBT stance of Hezbollah, and evidence of human rights abuse in Syria, Yemen and Ukraine.

Many countries have passed laws against publishing ‘misinformation’ or ‘fake news’ which are, in effect, a licence to censor journalism and online content. For all Apple’s promises that client-side scanning will not be used for anything except protecting children from exploitation, it’s hard to see how they could resist local legislation that required them to check messages or photographs for forbidden material.

Closer to home, a tech company that sells itself on protecting its customers’ privacy now tells us that there is no such thing as a truly private communication. “Nothing to hide, nothing to fear” has never been a good argument. Our private exchanges should not be subject to routine scrutiny, even by machines.


Timandra Harkness presents the BBC Radio 4 series FutureProofing and How To Disagree. Her book, Big Data: Does Size Matter?, is published by Bloomsbury Sigma.
