Every day, thousands of children are being sexually abused. You can stop the abuse of at least one child by simply praying. You can possibly stop the abuse of thousands of children by forwarding the link in First Time Visitor? by email, Twitter or Facebook to every Christian you know. Save a child or lots of children!!!! Do Something, please!

3:15 PM prayer in brief:
Pray for God to stop 1 child from being molested today.
Pray for God to stop 1 child molestation happening now.
Pray for God to rescue 1 child from sexual slavery.
Pray for God to save 1 girl from genital circumcision.
Pray for God to stop 1 girl from becoming a child-bride.
If you have the faith, pray for 100 children rather than one.
Give Thanks. There is more to this prayer here

Please note: All my writings and comments appear in bold italics in this colour

Friday, 3 September 2021

Apple Puts CSAM Scan on Back Burner as Many Groups Complain Except Children


Apple delays controversial plan to scan iPhones for child abuse images following privacy backlash

3 Sep, 2021 14:34

©  REUTERS/Mike Segar/File Photo


Apple has announced it will “take additional time” in the coming months to work on plans for flagging child sexual abuse material (CSAM), amid concerns from activists and rights groups over censorship and privacy issues.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson said in a statement on Friday.

Did you get any feedback from the children? Of course you didn't; children are voiceless, unorganized, and completely at your mercy, which seems to be somewhat lacking right now. For a while there I thought you were on their side, speaking up for them. Does the ACLU have any history of protecting children? They seem to be protecting the rights of pedophiles here. I'm so disappointed in you, Apple.

The delay follows a controversial announcement that was immediately met with calls to abandon the plans from civil rights groups, including the American Civil Liberties Union (ACLU).

Apple’s technology would scan photos and conversations for CSAM, using a program the company previously claimed would still protect individual privacy because the technology does not identify the overall details of a picture or conversation, or need to be in possession of either – though many critics have voiced their doubts.

The system uses a database of reference 'image hashes' to recognize specific content to be flagged, though security experts have warned that such technology could be manipulated, or that innocent images could be misinterpreted.
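To make the idea of hash-database matching concrete, here is a minimal toy sketch. It uses a simple "average hash" for illustration only; this is not Apple's actual NeuralHash algorithm, and the images, function names, and threshold are all hypothetical. It also shows why near-duplicate matching can cut both ways: tolerance for small changes is what lets unrelated images occasionally collide.

```python
# Toy sketch of hash-based image matching (NOT Apple's NeuralHash).
# Each "image" is a flattened list of grayscale pixel brightness values.

def average_hash(pixels):
    """One bit per pixel: is it brighter than the image's mean brightness?"""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming_distance(h1, h2):
    """Number of bits that differ between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def matches_database(pixels, known_hashes, threshold=2):
    """Flag an image if its hash is within `threshold` bits of any known hash.
    The tolerance catches near-duplicates, but is also why unrelated
    images can occasionally be 'misinterpreted' as matches."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

# Hypothetical 3x3 grayscale images, flattened.
flagged   = [10, 200, 30, 220, 40, 210, 20, 230, 50]
similar   = [12, 198, 33, 219, 41, 208, 22, 228, 52]   # near-duplicate
unrelated = [200, 10, 220, 30, 210, 40, 230, 20, 240]

database = {average_hash(flagged)}
print(matches_database(similar, database))    # True  (near-duplicate matches)
print(matches_database(unrelated, database))  # False
```

Real systems use far more robust perceptual hashes, but the matching step is the same shape: compare a fingerprint of the user's image against a database of fingerprints of known material, without the scanner needing the original images themselves.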

Even Apple employees have reportedly expressed concerns with the detection technology, worrying that it could be used to work around encryption protections, that it could easily misidentify and flag some photos – or even that some governments could exploit it to find other material. Apple maintains that it will refuse any requests from governments to use the system for anything other than child abuse images.

“iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent,” read a letter from a coalition of more than 90 activist groups to Apple CEO Tim Cook on the potential changes. 

The exact timeline for the current delay is unknown, but the new detection system was originally intended to be in use sometime this year.

=====================================================================================

