    Amid backlash, Apple will change photo-scanning plan but won’t drop it completely

    Karlston
    Apple issues vague statement promising "improvements" but still plans to scan photos.

    Apple said Friday that it will make some changes to its plan to have iPhones and other devices scan user photos for child sexual-abuse images. But Apple said it still intends to implement the system after making "improvements" to address criticisms.


    Apple provided this statement to Ars and other news organizations today:

    Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material [CSAM]. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

    The statement is vague: it doesn't say what changes Apple will make or which advocacy groups and researchers it will collect input from. But given the backlash from security researchers, privacy advocates, and customers, it seems likely that Apple will try to address concerns about user privacy and the possibility that it could give governments broader access to customers' photos.

    Privacy groups warned of government access

    It isn't clear how Apple could implement the system in a way that eliminates its critics' biggest privacy concerns. Apple has claimed it would refuse government demands to expand photo-scanning beyond CSAM. But privacy and security advocates argue that once the system is deployed, Apple likely won't be able to avoid giving governments more user content.


    "Once this capability is built into Apple products, the company and its competitors will face enormous pressure—and potentially legal requirements—from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable," 90 policy groups from the US and around the world said in an open letter to Apple last month. "Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis."


    Apple previously announced that devices with iCloud Photos enabled will scan images before they are uploaded to iCloud. Because an iPhone with iCloud Photos turned on uploads each new photo shortly after it is taken, new photos would be scanned almost immediately for users who have enabled the feature.


    Apple has said it will also add a tool to the Messages application that will "analyze image attachments and determine if a photo is sexually explicit." The system will be optional for parents, who can enable it in order to have Apple devices "warn children and their parents when receiving or sending sexually explicit photos."


    Apple initially said it would roll the changes out later this year, in the US only at first, as part of updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. Apple's promise to "take additional time over the coming months to collect input and make improvements" suggests the scanning system could be implemented later than Apple intended, but the company never provided a firm release date to begin with.

    Apple called system an advancement in privacy

    As we've previously written, Apple says its CSAM-scanning technology "analyzes an image and converts it to a unique number specific to that image" and flags a photo when its hash is identical or nearly identical to any hash in a database of known CSAM. An account can be reported to the National Center for Missing and Exploited Children (NCMEC) when about 30 CSAM photos are detected, a threshold Apple set to ensure that there is "less than a one in one trillion chance per year of incorrectly flagging a given account." That threshold could be changed in the future to maintain the one-in-one-trillion false-positive rate.
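
    To make the hash-matching and threshold logic concrete, here is a minimal sketch in Python. It is an assumption-laden illustration, not Apple's code: the real system uses Apple's NeuralHash perceptual hash and cryptographic protocols so that neither the device nor Apple learns about matches below the reporting threshold, and the hash values, Hamming-distance tolerance, and function names below are invented for illustration.

    from __future__ import annotations

    # Illustrative sketch only -- not Apple's implementation. The real system uses
    # Apple's NeuralHash perceptual hash plus cryptographic techniques so that
    # matches below the reporting threshold are never revealed; the constants,
    # distance tolerance, and function names here are hypothetical.

    REPORT_THRESHOLD = 30        # roughly the figure Apple cited publicly
    MAX_HAMMING_DISTANCE = 2     # hypothetical tolerance for "nearly identical" hashes

    def hamming_distance(a: int, b: int) -> int:
        """Count the bits that differ between two fixed-length hash values."""
        return bin(a ^ b).count("1")

    def matches_known_csam(photo_hash: int, known_hashes: set[int]) -> bool:
        """True if the photo's hash is identical or nearly identical to a database entry."""
        return any(hamming_distance(photo_hash, known) <= MAX_HAMMING_DISTANCE
                   for known in known_hashes)

    def should_report(photo_hashes: list[int], known_hashes: set[int]) -> bool:
        """An account only becomes reportable once matches reach the threshold."""
        matches = sum(1 for h in photo_hashes if matches_known_csam(h, known_hashes))
        return matches >= REPORT_THRESHOLD

    In the scheme Apple described, the device itself cannot tell whether an individual photo matched; the toy code above collapses that cryptographic machinery into a simple counter purely to show why a single match would never trigger a report on its own.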


    Apple has argued that its system is actually an advancement in privacy because it will scan photos "in the most privacy-protecting way we can imagine and in the most auditable and verifiable way possible."


    "If you look at any other cloud service, they currently are scanning photos by looking at every single photo in the cloud and analyzing it. We wanted to be able to spot such photos in the cloud without looking at people's photos and came up with an architecture to do this," Craig Federighi, Apple's senior VP of software engineering, said last month. The Apple system is "much more private than anything that's been done in this area before," he said.


    Changes to the system could be fought by advocacy groups that have urged Apple to scan user photos for CSAM. Apple partnered on the project with NCMEC, which dismissed privacy criticisms as coming from "the screeching voices of the minority." Apple seemingly endorsed that characterization, distributing the statement to employees in an internal memo defending the photo-scanning plan on the day it was announced.
