By SecureWorld News Team
Fri | Sep 3, 2021 | 12:10 PM PDT

When Apple announced it was developing tools to search iPads, iPhones, and other devices for child pornography, it was the shot heard around the world in tech.
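For context, Apple's published technical summary described matching image hashes on the device against a database of hashes of known abuse imagery before photos are uploaded to iCloud. The sketch below is only a simplified illustration of that general hash-matching idea; it uses an ordinary cryptographic hash and a made-up hash set, not Apple's NeuralHash perceptual hash, private set intersection, or its blinded on-device database.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex digest of a byte string (stand-in for a real image hash)."""
    return hashlib.sha256(data).hexdigest()

# Placeholder "known" hash set, built from dummy byte strings so the example is
# self-contained. Apple's actual design uses a perceptual hash that tolerates
# resizing and recompression, which this exact-match lookup does not replicate.
KNOWN_HASHES = {sha256_hex(b"known-image-1"), sha256_hex(b"known-image-2")}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set."""
    return sha256_hex(image_bytes) in KNOWN_HASHES

print(matches_known_hash(b"known-image-1"))    # True: byte-for-byte match
print(matches_known_hash(b"unrelated-image"))  # False
```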

On the surface, who would object to a system that brings child predators to justice?

Underneath, the problems posed by the system were significant and complicated. According to many privacy advocates, the move would lead us down a slippery slope.

The plan sparked suspicion and led to serious questions about Apple's commitment to privacy.

But now Apple is taking a step back. Today, the company decided to delay launching its Child Sexual Abuse Material (CSAM) detection features indefinitely.

Apple shelves launch in favor of more research

In an update to the "Expanded Protections for Children" section of the company's website, Apple posted a note in small, italicized font at the top of the page, and it did so the day before the long Labor Day weekend in the U.S.

The update, hinting at why Apple chose to delay the system, reads:

"Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

While Apple still stands by its position that the tool is important, the company says it will spend the coming months collecting input and making improvements before releasing the features.

Growing outrage from advocacy groups, consumers, and more

SecureWorld News previously reported on Apple's child safety policy, citing an open letter urging Apple to halt deployment of the plan. More than 6,000 individuals had signed the petition at the time of that article.

As of September 3, nearly 9,000 people had signed the letter, and many other opponents of the policy had come forward in op-eds and public commentary.

Researchers at Princeton University condemned the system as "dangerous," saying that while the intentions were good, it could be abused by, say, repressive government regimes.

The unusual twist to this story? Jonathan Mayer, an assistant professor at Princeton, and graduate researcher Anunay Kulshrestha had been working on a very similar system aimed at catching child predators.

"We were so disturbed that we took a step we hadn't seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides. We'd planned to discuss paths forward at an academic conference this month," Princeton researchers told The Washington Post.

This morning, Edward Snowden also tweeted in celebration of the delay.

Apple's Head of Privacy, Erik Neuenschwander, had earlier told TechCrunch that the company's CSAM efforts are not new, and he insisted that protecting users' privacy remained paramount:

"We have two co-equal goals here. One is to improve child safety on the platform and the second is to preserve user privacy. And what we’ve been able to do across all three of the features is bring together technologies that let us deliver on both of those goals."

Will Apple somehow be able to deliver on both goals in the future? Is that even possible? And which goal is more crucial to deliver on: catching predators or protecting privacy?

Leave your comments below. 

[Learn more about privacy opinions and protocols at one of SecureWorld's upcoming conferences or webinars.]

Tags: Apple, Privacy