From bgr.com

Apple stops developing CSAM detection system for iPhone users

Published Dec 7th, 2022 1:56PM EST

Last year, Apple announced that iCloud Photos would be able to detect known Child Sexual Abuse Material (CSAM) in users’ photos by matching them against a database of CSAM image hashes. Although the matching would happen on-device, so Apple itself would not see the photos, the plan drew heavy criticism from privacy and security researchers.
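
For context on what “matching against a database of image hashes” means: Apple’s actual design paired a perceptual hash (NeuralHash) with a private set intersection protocol so that neither side learned about non-matching photos. Purely as a rough illustration of the basic hash-lookup idea, and not Apple’s implementation, a minimal Swift sketch might look like this (the function names and database entries are hypothetical):

```swift
import CryptoKit
import Foundation

// Illustrative only: Apple's system used a perceptual NeuralHash plus
// private set intersection, not a plain cryptographic digest. This sketch
// substitutes SHA-256 to show the general shape of hash matching.

/// Hex-encoded SHA-256 digest of raw image bytes.
func imageDigest(_ data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Hypothetical known-hash database (placeholder entry).
let knownHashes: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000"
]

/// On-device check: flag a photo only if its digest is in the database.
func matchesKnownDatabase(_ photoBytes: Data) -> Bool {
    knownHashes.contains(imageDigest(photoBytes))
}
```

One reason the real design was more complex: a cryptographic digest like SHA-256 only matches byte-identical files, while a perceptual hash tolerates resizing and re-encoding, which is why Apple built NeuralHash rather than using an ordinary digest.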

Now, after announcing Advanced Data Protection for iCloud, Apple executive Craig Federighi has confirmed that the company will not roll out the CSAM detection system for iPhone users, as it has stopped developing it.

The information was confirmed during an interview with The Wall Street Journal’s Joanna Stern. When Apple announced the CSAM detection system, privacy and security researchers warned that it could be misused by governments or hackers to gain access to sensitive information on users’ phones. In the interview, Federighi explained Apple’s change of plans:

Mr. Federighi said Apple’s focus related to protecting children has been on areas such as communication and giving parents tools to protect children in iMessage. “Child sexual abuse can be headed off before it occurs,” he said. “That’s where we’re putting our energy going forward.”

For example, through its parental-control software, Apple can notify parents who opt in when nude photos are sent or received on a child’s device, but it will no longer develop a system that scans users’ photo libraries for known CSAM.

iCloud Advanced Data Protection. Image source: Apple Inc.

Apart from that, Apple today announced three important features coming to iPhone users in 2023: Advanced Data Protection for iCloud, which expands end-to-end encryption to 23 data categories; Security Keys, which let people use third-party hardware security keys as a second factor for two-factor authentication; and iMessage Contact Key Verification, which helps users confirm that the person contacting them on iMessage is who they claim to be and not an impostor.
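
To make the end-to-end claim concrete: with Advanced Data Protection, the keys protecting those data categories live only on the user’s devices, so Apple’s servers hold nothing but ciphertext. As a hedged sketch of that basic principle, not Apple’s actual key-management design (all names here are hypothetical):

```swift
import CryptoKit
import Foundation

// Hypothetical sketch of the end-to-end principle behind Advanced Data
// Protection: content is sealed with a key that only the user's devices
// hold, so the server only ever stores ciphertext. This shows the basic
// primitive, not Apple's actual key-management design.

do {
    let deviceKey = SymmetricKey(size: .bits256)   // never leaves the device

    let note = Data("private note synced to iCloud".utf8)
    let sealedBox = try AES.GCM.seal(note, using: deviceKey)
    let ciphertextForServer = sealedBox.combined!  // all the server stores

    // Only a device holding `deviceKey` can recover the plaintext.
    let restored = try AES.GCM.open(
        try AES.GCM.SealedBox(combined: ciphertextForServer),
        using: deviceKey
    )
    assert(restored == note)
} catch {
    print("crypto error: \(error)")
}
```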

“As customers have put more and more of their personal information of their lives into their devices, these have become more and more the subject of attacks by advanced actors,” said Craig Federighi, Apple’s senior vice president of software engineering, in an interview. Some of these actors are going to great lengths to get their hands on the private information of people they have targeted, he said.

José is a Tech News Reporter at BGR. He has previously covered Apple and iPhone news for 9to5Mac, and was a producer and web editor for Latin American broadcaster TV Globo. He is based out of Brazil.
