
Apple Says Its iCloud Child-Porn Scanning System Won’t Trigger Alerts Until It Detects At Least 30 Images

2024-03-01 | Variety | Todd Spangler

Apple announced new details about its plan to scan users’ iCloud photos for child pornography, as the tech giant remains in damage-control mode following a backlash over the privacy implications of the initiative.

Last week, Apple said it would implement a feature to detect photos stored in iCloud Photos that depict sexually explicit activities involving children, as part of the iOS 15 and iPadOS 15 updates due out this fall.

On Friday, the company — seeking to reassure customers that it is minimizing the chance the system might inaccurately flag an account as harboring child pornography — said it will not take action until the system has identified at least 30 matching child sexual abuse material (CSAM) images. It also said the system will be publicly auditable by third parties.

Apple held a briefing with reporters and published a new document, “Security Threat Model Review of Apple’s Child Safety Features,” with more information about its plans.

Why wouldn’t Apple report an iCloud account if it detected even just one known child-porn image?

The initial 30-image threshold “contains a drastic safety margin reflecting a worst-case assumption about real-world performance” to minimize the possibility of false positives, Apple says in the new document.

Over time, Apple may lower that threshold as it monitors the performance of its CSAM system. But the company said “the match threshold will never be lower than what is required to produce a one-in-one-trillion false positive rate for any given account.”
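
For a sense of why a high match threshold matters, here is a rough, illustrative calculation (a Swift sketch, not anything Apple published). The per-image false-match rate and photo-library size below are assumptions chosen only to show how requiring many independent matches collapses the chance of an account being falsely flagged.

```swift
import Foundation

// Illustrative arithmetic only: the per-image false-match rate and photo-library
// size are assumptions, not figures disclosed by Apple. A Poisson approximation
// estimates the chance an account is falsely flagged at a given match threshold.
func falseFlagProbability(perImageRate p: Double, libraryCount n: Int, threshold t: Int) -> Double {
    let lambda = p * Double(n)          // expected number of false matches per account
    var term = exp(-lambda)             // Poisson P(X = 0)
    var tail = t == 0 ? term : 0.0      // running P(X >= t)
    for k in 1...(t + 200) {            // 200 extra terms is ample for convergence here
        term *= lambda / Double(k)      // P(X = k) from P(X = k - 1)
        if k >= t { tail += term }
    }
    return tail
}

// Hypothetical numbers: a 100,000-photo library, one-in-a-million per-image false-match rate.
print(falseFlagProbability(perImageRate: 1e-6, libraryCount: 100_000, threshold: 1))   // ≈ 0.095
print(falseFlagProbability(perImageRate: 1e-6, libraryCount: 100_000, threshold: 30))  // astronomically small
```

Under these assumed numbers, a single-match threshold would falsely flag roughly one account in ten, while requiring 30 coincidental matches pushes the probability far below the one-in-one-trillion figure Apple cites.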

Apple insisted the CSAM system was designed with privacy in mind. The company will use a device-local, hash-based matching system that compares users’ photos against a database of known child-porn images compiled from two or more child-safety agencies in different countries. Once an account reaches the 30-image match threshold, Apple will verify that the images are child pornography and will then report the user to the National Center for Missing and Exploited Children (NCMEC) and U.S. law enforcement agencies.
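
As a concrete picture of the matching logic described above, here is a minimal sketch (not Apple’s implementation, which adds cryptographic protections that this sketch omits). All identifiers and hash strings are placeholders.

```swift
// Minimal sketch of threshold-based matching; every name and hash value here is illustrative.
struct MatchChecker {
    // Hypothetical database of hashes of known images.
    let knownImageHashes: Set<String>
    // Apple's announced initial reporting threshold.
    let threshold = 30

    // True only if the number of matching photos meets or exceeds the threshold.
    func shouldFlag(accountPhotoHashes: [String]) -> Bool {
        let matches = accountPhotoHashes.filter { knownImageHashes.contains($0) }.count
        return matches >= threshold
    }
}

// Example: a single match is far below the threshold, so nothing is flagged.
let checker = MatchChecker(knownImageHashes: ["hashA", "hashB", "hashC"])
print(checker.shouldFlag(accountPhotoHashes: ["hashA", "hashX"]))  // false
```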

Also on Friday, Apple sought to make clear that the system cannot be subverted by malicious actors. The company said that “surreptitious modifications to the CSAM hash database,” whether made “inadvertently, or through coercion,” would be “insufficient to cause innocent people to be reported with this system.” Apple said it will refuse all requests to add non-CSAM images to its hash database, which comprises only entries that represent “the intersection of hashes from at least two child safety organizations operating in separate sovereign jurisdictions.”
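
The “intersection” requirement can be pictured as a simple set operation. A hedged sketch, with placeholder organization names and toy hash values:

```swift
// Only entries supplied by both organizations end up in the database that ships on devices.
let orgAHashes: Set<String> = ["h1", "h2", "h3"]
let orgBHashes: Set<String> = ["h2", "h3", "h4"]

// A hash submitted by only one organization (say, under coercion) is dropped.
let shippedDatabase = orgAHashes.intersection(orgBHashes)
print(shippedDatabase)  // contains "h2" and "h3" (Set order is not guaranteed)
```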

Other tech companies, including Google, Facebook and Microsoft, have scanned images uploaded to their cloud services to check for CSAM for several years.

Following Apple’s initial announcement, privacy advocates raised red flags about Apple’s plan to scan iPhone users’ photos. The Electronic Frontier Foundation, for one, warned that it would open a “backdoor” to potential governmental abuse and invasion of privacy.

“To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again,” EFF’s director of federal affairs India McKinney and senior staff technologist Erica Portnoy wrote in a blog post. “Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Craig Federighi, Apple SVP of software engineering, defended the CSAM scanning system in a Wall Street Journal interview published Friday. “This isn’t doing some analysis for, ‘Did you have a picture of your child in the bathtub?’ Or, for that matter, ‘Did you have a picture of some pornography of any other sort?’” Federighi told the Journal. “This is literally only matching on the exact fingerprints of specific known child pornographic images.”

In addition to the photo-scanning system for child porn, Apple is introducing an opt-in feature with the iOS 15 update that will add tools to the iPhone’s Messages app to warn children and their parents when they receive or send sexually explicit photos. That’s separate from the CSAM scanning system: Apple says the feature uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit, so Apple itself never gains access to the messages.

“When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo,” Apple says. “As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos.”
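
A hedged sketch of the Messages flow as described, with the on-device check stubbed out as a closure; none of the type or function names below are real Apple APIs, and nothing in this flow leaves the device:

```swift
import Foundation

// Placeholder types standing in for the Messages feature's decision flow.
struct IncomingAttachment {
    let imageData: Data
}

enum AttachmentPresentation {
    case showNormally
    case blurWithWarning(notifyParentsIfViewed: Bool)
}

func present(_ attachment: IncomingAttachment,
             childAccount: Bool,
             parentalNotificationEnabled: Bool,
             isSexuallyExplicit: (Data) -> Bool) -> AttachmentPresentation {
    // The classification runs locally, so message content is never uploaded for analysis.
    if childAccount && isSexuallyExplicit(attachment.imageData) {
        return .blurWithWarning(notifyParentsIfViewed: parentalNotificationEnabled)
    }
    return .showNormally
}

// Example: an explicit image sent to a child account is blurred, with parental notice enabled.
let presentation = present(IncomingAttachment(imageData: Data()),
                           childAccount: true,
                           parentalNotificationEnabled: true,
                           isSexuallyExplicit: { _ in true })
print(presentation)
```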

In the Journal interview, Federighi admitted that announcing the CSAM iCloud scanning alongside the monitoring of kids’ Messages app for sexually explicit content caused “confusion,” as some people assumed that Apple was going to scan their private messages for child porn. “We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing,” the exec said.

A third child-safety feature in iOS 15 will provide updates to Siri and Search to “provide parents and children expanded information and help if they encounter unsafe situations.” Siri and Search will intervene when users try to search for child sexual abuse material, displaying prompts that will “explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue,” Apple said.

(By Todd Spangler)