Mon | Aug 9, 2021 | 2:21 PM PDT

The debate over privacy versus crime fighting has been around since well before the first iPhone.

We all want our personal data to be secure and private. But what about the data and privacy of criminals who target children?

In a major move, Apple is introducing new technology that aims to help protect children from predators and limit the spread of Child Sexual Abuse Material (CSAM), which it calls Expanded Protections for Children.

Apple will be introducing the technology with three focus points:

  • "New communication tools will enable parents to play a more informed role in helping their children navigate communication online."
  • "iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy."
  • "Updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations."

The move may be well-intentioned, with children's safety in mind, but to what degree will users' privacy be compromised, and is that compromise worth it?

Open letter against new Apple technology

Since Apple's announcement, security and privacy experts from around the world have chimed in to share their concerns over the new technology. 

Over 6,000 individuals have signed an open letter calling for Apple to halt the deployment of this new content monitoring program. Here is what the letter says is wrong with the technology:

"Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.

Apple's proposed technology works by continuously monitoring photos saved or shared on the user's iPhone, iPad, or Mac. One system detects if a certain number of objectionable photos is detected in iCloud storage and alerts the authorities. Another notifies a child's parents if iMessage is used to send or receive photos that a machine learning algorithm considers to contain nudity.

Because both checks are performed on the user's device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user's privacy."

Apple markets the privacy of its products as a differentiator. One ad campaign's slogan is simply "Privacy. That's iPhone." The company positions itself as a leading defender of end-to-end encryption, meaning message content is encrypted so that not even Apple can read it on its servers.

Greg Nojeim, Co-Director of the Center for Democracy and Technology's Security & Surveillance Project, is "deeply concerned" about the new technology:

"Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world. Apple should abandon these changes and restore its users' faith in the security and integrity of their data on Apple devices and services."

Dr. Nadim Kobeissi, a researcher in security and privacy issues, expressed his concerns:

"Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure. What happens when local regulations in Saudi Arabia mandate that messages be scanned not for child sexual abuse, but for homosexuality or for offenses against the monarchy?"

The Electronic Frontier Foundation believes Apple is "opening the door to broader abuses":

"It's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change."

For more quotes from security experts and to view the list of organizations and individuals that have signed the open letter, see: An Open Letter Against Apple's Privacy-Invasive Content Scanning Technology.

Apple says child protection technology is safe

Despite the overwhelming concerns from the security and privacy community, Apple believes that it is not compromising the privacy of its users with this new technology.

New tools will be added to the Messages app that warn children and their parents when sexually explicit photos are received or sent. Apple says the feature is designed with user privacy in mind:

"Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages."

Another important aspect of the technology is limiting the spread of child pornography online. Apple will be able to detect Child Sexual Abuse Material (CSAM) images stored in iCloud and then alert the National Center for Missing and Exploited Children (NCMEC).

However, Apple says it can do all this and still maintain privacy:

"Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."

Apple will also use a technique called threshold secret sharing, meaning an account must cross a threshold of known CSAM matches before Apple can view the flagged iCloud photos. The company claims there is less than a one in one trillion chance per year of incorrectly flagging a given account.
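Threshold secret sharing is a standard cryptographic building block: a secret is split into pieces so that fewer pieces than the threshold reveal nothing, and only a quorum reconstructs it. The Python sketch below shows the classic Shamir-style construction to illustrate the principle; the prime, parameters, and share format are illustrative assumptions, not Apple's actual "safety voucher" design.

# Illustrative Shamir-style threshold secret sharing: any `threshold` shares
# recover the secret, while fewer reveal nothing about it. Parameters are
# chosen for demonstration only.
import secrets

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a short secret

def split_secret(secret: int, threshold: int, num_shares: int):
    """Split `secret` into points on a random polynomial of degree threshold-1."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, num_shares + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split_secret(secret=123456789, threshold=3, num_shares=5)
print(recover_secret(shares[:3]))   # 123456789 -- the threshold is met
# Any two shares alone interpolate to an unrelated value, revealing nothing.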

Read Apple's statement for more information on the company's Expanded Protections for Children technology.

One thing is for sure: people will be talking about this debate at SecureWorld conferences this year.
