Apple’s Child Sexual Abuse Detection System Stirs Privacy Debate

Critics argue that the Child Sexual Abuse Material (CSAM) detection system compromises user privacy and opens the door to censorship, surveillance and oppression.

On August 6, 2021, Apple announced the launch of a new feature: the Child Sexual Abuse Material (CSAM) detection system. While the feature is a step in the right direction, users are not happy with what it entails. Before a photo is uploaded to a user’s iCloud, software on the device will compute a fingerprint (a “hash”) of the image and compare it against a database of hashes of known child sexual abuse images supplied by child-safety organizations. A separate feature will scan iMessages sent to and from children’s accounts for sexually explicit images, even though iMessage is end-to-end encrypted. Once at least 30 of a user’s photos match the database, Apple will review them and report the account to the National Center for Missing and Exploited Children (NCMEC) for further action.
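At its core, the matching step is a threshold test over image fingerprints. The Python sketch below is a loose illustration of that idea only, not Apple’s implementation: Apple’s actual system uses its NeuralHash perceptual-hashing algorithm together with cryptographic techniques so that accounts below the threshold reveal nothing, and the fingerprint function and hash database here are hypothetical stand-ins.

    import hashlib

    THRESHOLD = 30  # Apple's reported minimum number of matches before review

    def fingerprint(image_bytes: bytes) -> str:
        # Hypothetical stand-in. A real perceptual hash, unlike this
        # cryptographic one, maps visually similar images to the same value.
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(photos: list[bytes], known_hashes: set[str]) -> int:
        # Count the photos whose fingerprint appears in the database of
        # known-CSAM hashes.
        return sum(1 for photo in photos if fingerprint(photo) in known_hashes)

    def should_flag(photos: list[bytes], known_hashes: set[str]) -> bool:
        # An account is flagged only once its match count reaches the
        # threshold; flagged accounts would then go to human review and,
        # if confirmed, be reported to NCMEC.
        return count_matches(photos, known_hashes) >= THRESHOLD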

Two Princeton academics, Jonathan Mayer and Anunay Kulshrestha, wrote the only peer-reviewed publication on how to build a system like Apple’s CSAM detector, and they concluded that the technology was dangerous. According to them, such a system could generate “false positives and malicious users could game the system to subject innocent users to scrutiny.” They warned against deploying their own system design. Though the feature has positive intentions, its approach of accessing users’ messages and scanning their images is worrying iPhone users. WhatsApp CEO Will Cathcart also expressed his displeasure with the move, calling it in a tweet a “wrong approach and a setback for people’s privacy all over the world.”
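The researchers’ worry about false positives and gaming can be made concrete with the sketch above: matching depends only on fingerprints, so an image engineered to collide with a database entry would register as a match regardless of what it depicts. A hypothetical illustration, reusing the earlier helpers:

    # Suppose an attacker crafts a harmless-looking image whose fingerprint
    # collides with an entry in the hash database.
    harmless_image = b"pixels of an ordinary photo"
    database = {fingerprint(harmless_image)}  # stands in for a collision

    # The innocent user's photo now registers as a match and counts
    # toward the reporting threshold.
    assert count_matches([harmless_image], database) == 1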

In light of this, over 90 policy groups from around the world signed an open letter asking Apple to drop this plan. The letter noted, “Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

Additionally, people are concerned that by creating such a system, Apple has opened the door for governments wanting to surveil citizens under the pretext of keeping them safe. Eva Galperin, Cybersecurity Director at the Electronic Frontier Foundation, told The New York Times, “Once you build a system that can be aimed at any database, you will be asked to aim the system at a database.” This concern is well-founded, given that Apple agreed to move the personal data of its Chinese users to the servers of a state-owned firm at the government’s request. Still, Apple released a document saying that it will not succumb to any government pressure to abuse this system: “Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

Apple’s Senior Vice President of Software Engineering Craig Federighi has chalked these concerns up to “misunderstandings”. In an interview, he explained, “It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.”

Despite widespread calls to drop the plan, Apple maintains that its system is “much more private than anything that’s been done in this area before.” Apple will launch the CSAM detection system later this year.

