Apple releases FAQ downplaying privacy concerns over new child protection system as watchdogs warn of overreach
Apple pushed back against criticism that its new anti-child sexual abuse detection system could be used for "backdoor" surveillance. The company insisted it won't "accede to any government's request to expand" the system's scope.
The new plan, announced last week, includes a feature that identifies and blurs sexually explicit images received by children using Apple's "Messages" app, and another feature that notifies the company if it detects any Child Sexual Abuse Material (CSAM) in iCloud.
The announcement sparked instant backlash from digital privacy groups, who said it "introduces a backdoor" into the company's software that "threatens to undermine fundamental privacy protections" for users, under the guise of child protection.
In an open letter posted on GitHub and signed by security experts, including former NSA whistleblower Edward Snowden, the groups condemned the "privacy-invasive content scanning technology" and warned that the features have the "potential to bypass any end-to-end encryption."
After an internal memo reportedly referred to the criticism as the "screeching voices of the minority," Apple on Monday released an FAQ about its "Expanded Protections for Children" system, saying it was designed to apply only to images uploaded to iCloud and not to the "private iPhone photo library." It also will not affect users who have iCloud Photos disabled.
The system, it adds, works only with CSAM image hashes provided by the National Center for Missing and Exploited Children (NCMEC), and "there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC."
"Image hashes" refers to the use of algorithms to assign a unique "hash value" to an image, which has been likened to a "digital fingerprint" that makes it easier for platforms to remove content deemed harmful.
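Apple has not published the internals of its "NeuralHash" algorithm, but the hash-matching idea the FAQ describes can be sketched in a few lines of Python. The sample database and function names below are hypothetical, and an ordinary cryptographic hash stands in for the perceptual hash Apple actually uses.

```python
import hashlib

# Hypothetical database of known hash values (hex digests). In Apple's
# system this role is played by NCMEC-provided NeuralHash values; plain
# SHA-256 is used here purely for illustration.
KNOWN_HASHES = {
    "a3f5...",  # placeholder entry, not a real hash
}

def image_hash(image_bytes: bytes) -> str:
    """Return a 'digital fingerprint' for the image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Check whether an image's hash appears in the known-hash set."""
    return image_hash(image_bytes) in KNOWN_HASHES
```

One design difference matters here: a cryptographic hash like SHA-256 changes completely if a single pixel changes, while a perceptual hash such as NeuralHash is built to survive resizing and re-encoding, which is also what opens the door to the false positives discussed below.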
While Apple insists it screens only for image hashes "validated to be CSAM" by child safety organizations, digital rights watchdog the Electronic Frontier Foundation (EFF) had previously warned that this would lead to "mission creep" and "overreach."
"One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of 'terrorist' content that companies can contribute to and access for the purpose of banning such content," the non-profit warned last week, referring to the Global Internet Forum to Counter Terrorism (GIFCT).
Apple countered that, because it "does not add to the set of known CSAM image hashes," and because the "same set of hashes" is stored in the operating system of every iPhone and iPad, it is "not possible" to use the system to target users by "injecting" non-CSAM images into it.
"Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it," the company vows in its FAQ.
"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future," it added.
However, the company has already been criticized for using "misleading phrasing" to avoid explaining the potential for "false positives" in the system, the "likelihood" of which Apple claims is "less than one in one trillion [incorrectly flagged accounts] per year."
Here's an example of what I mean about misleading phrasing. Apple says this system reports CSAM (true!) and it doesn't report on photos that are exclusively on-device and aren't synced to iCloud (also true!). But what about false positives for photos that *are* synced to iCloud? pic.twitter.com/E7zg5kHLnj
– Jonathan Mayer (@jonathanmayer) August 9, 2021
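Apple's "one in one trillion" figure is a claim about accounts, not individual photos: the system reportedly requires a threshold number of matches before any human review. Apple has not published the per-image false-match rate or that threshold, but the relationship between the two can be sketched as a binomial tail calculation; every number below is a purely illustrative assumption.

```python
from math import exp, lgamma, log

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """Log of the binomial probability P(X = k), computed in log space
    with lgamma to avoid overflow for large n."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def account_flag_probability(n_photos: int, p_image: float, threshold: int) -> float:
    """P(at least `threshold` false matches among `n_photos` images),
    assuming each image falsely matches independently with probability
    `p_image`. Independence is itself an assumption: near-duplicate
    photos could fail in correlated ways."""
    return sum(exp(log_binom_pmf(n_photos, k, p_image))
               for k in range(threshold, n_photos + 1))

# Illustrative inputs only: 10,000 synced photos, a one-in-a-million
# per-image false-match rate, and a hypothetical threshold of 30 hits.
print(account_flag_probability(10_000, 1e-6, 30))
```

With those toy inputs the account-level probability comes out on the order of 10^-93, which shows how even a per-image error rate far worse than one in a trillion could still support Apple's account-level claim, and why critics like Mayer want the per-image rate disclosed.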