Revealed: US police prevented from viewing many
online child sexual abuse reports, lawyers say
Social media firms relying on AI for moderation generate unviable reports which prevent authorities from investigating cases
Social media companies relying on artificial intelligence software to moderate their platforms are generating unviable reports on cases of child sexual abuse, preventing US police from seeing potential leads and delaying investigations of alleged predators, the Guardian can reveal.
By law, US-based social media companies are required to report any child sexual abuse material detected on their platforms to the National Center for Missing & Exploited Children (NCMEC). NCMEC acts as a nationwide clearinghouse for leads about child abuse, which it forwards to the relevant law enforcement departments in the US and around the world. The organization said in its annual report that it received more than 32m reports of suspected child sexual exploitation from companies and the public in 2022, comprising roughly 88m images, videos and other files.
Meta is the largest reporter of these tips, with more than 27m, or 84%, generated by its Facebook, Instagram and WhatsApp platforms in 2022. NCMEC is partly funded by the Department of Justice, but it also receives private and corporate donations, including from Meta. NCMEC and Meta do not disclose the size of this donation.
Social media companies, Meta included, use AI to detect and report suspicious material on their sites and employ human moderators to review some of the flagged content before sending it to law enforcement. However, US law enforcement agencies can only open AI-generated reports of child sexual abuse material (CSAM) by serving a search warrant to the company that sent them. Petitioning a judge for a warrant and waiting to receive one can add days or even weeks to the investigation process.
“If the company has not indicated when they report the file to NCMEC that they have viewed the file prior to making the report, we cannot open it,” said Staca Shehan, vice-president of the analytical services division of NCMEC. “When we send it along to law enforcement, they cannot view it or open it without first serving legal process on the [social media company].”
Continue reading this story on the Guardian.
=======================================================
Rochdale grooming: Community 'ripped apart' by abuse scandal
Communities have been "ripped apart" by the failures of local authorities and police to tackle child grooming gangs in Rochdale, a councillor has said.
A report found the widespread abuse of young girls by Asian (read Pakistani) men in the town between 2003 and 2012 was not properly investigated and victims were ignored.
Andy Kelly, the Liberal Democrat group leader at Rochdale Council, said the findings were "heartbreaking".
Rochdale was again "in the spotlight for all the wrong reasons", he added.
The report found young girls were abused by a network of Asian men, while 96 people who pose a potential risk to children remain at large, most of whom have not been prosecuted.
Deputy mayor of Greater Manchester Kate Green said suspects were being "actively monitored" by police, but added the investigations were "complex".
"Why aren't we arresting them and charging them?", Mr Kelly asked in an interview with BBC Radio Manchester.
"There is not point in pussyfooting around it, we need to get this sorted out."
'Gut-wrenching'
The councillor said the failings exposed in the report had led to "a real community cohesion issue".
"If you speak to people in the Asian community they feel a sense of partly shame, but partly a kind of a gut-wrenching feeling that they as a community are being blamed for what's going on," he said.
"The abusers in this case are Asian, but not everyone who's Asian is an abuser."
Many feel local authorities have still not done enough to protect vulnerable children, Mr Kelly said.
"We need to absolutely nail this issue, get people arrested and prosecuted, and put in place a service and support network for vulnerable young women and girls."
Rochdale Council leader Neil Emmott said those involved in the failings were "long gone".
A multi-agency unit called the Sunrise Team was working to prevent, detect and prosecute abusers, he said.
He added: "We have also educated schools, taxi drivers, off-licences, takeaways, barbers, licensed premises and local residents about the signs of child sexual exploitation."
=======================================================
Children committing half of reported child sexual abuse offences, new figures reveal
Police say access to violent pornography and smartphones has fuelled the rise of child-on-child abuse, which now makes up 52% of recorded child abuse cases.
Wednesday 10 January 2024 07:00, UK