Apple delays controversial plan to scan iPhones for child abuse images following privacy backlash
3 Sep, 2021 14:34
Apple has announced it will “take additional time” in the coming months to work on plans for flagging child sexual abuse material (CSAM), amid concerns from activists and rights groups over censorship and privacy issues.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson said in a statement on Friday.
Did you get any feedback from the children? Of course you didn't; children are voiceless, unorganized, and completely at your mercy, which seems to be somewhat lacking right now. For a while there I thought you were on their side, speaking up for them. Does the ACLU have any history of protecting children? They seem to be protecting the rights of pedophiles here. I'm so disappointed in you, Apple.
The delay follows a controversial announcement that was immediately met with calls to abandon the plans from civil rights groups, including the American Civil Liberties Union (ACLU).
Apple’s technology would scan photos and conversations for CSAM, using a program the company previously claimed would still protect individual privacy, since the technology neither identifies the overall contents of a picture or conversation nor requires possession of either, though many critics have voiced their doubts.
The system uses a database of reference ‘image hashes’ to recognize specific content to be flagged, though security experts have warned that such technology could be manipulated, or that innocent images could be misidentified.
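The database-lookup idea described above can be sketched roughly as follows. This is only an illustrative Python sketch: Apple's actual system uses a perceptual hash ("NeuralHash") and an on-device private matching protocol, neither of which is reproduced here; the function names and the use of a cryptographic hash are assumptions made for illustration.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual image hash; a real perceptual hash would
    # tolerate small edits to the image, which SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes, known_hashes: set[str]) -> bool:
    # Flag the image only if its hash appears in the reference database;
    # the image contents themselves are never inspected.
    return image_hash(image_bytes) in known_hashes

# Build a tiny reference database and check two images against it.
database = {image_hash(b"known-flagged-image")}
print(is_flagged(b"known-flagged-image", database))  # True
print(is_flagged(b"harmless-photo", database))       # False
```

Note that because only hashes are compared, the scanner learns nothing about images that do not match, which is the basis of Apple's privacy claim; the critics' concern is about what goes into the database and how matches are acted on.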
Even Apple employees have reportedly expressed concerns with the detection technology, worrying that it could be used to work around encryption protections, that it could easily misidentify and flag some photos – or even that some governments could exploit it to find other material. Apple maintains that it will refuse any requests from governments to use the system for anything other than child abuse images.
“iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent,” read a letter from a coalition of more than 90 activist groups to Apple CEO Tim Cook on the potential changes.
The exact timeline for the current delay is unknown, but the new detection system was originally intended to be in use sometime this year.