Can't, or won't?
Meta can’t stop paedos from using Messenger, Pay to buy child sexual abuse material
Paedophiles are using Meta’s Facebook Messenger and Meta Pay to buy and sell child sexual abuse material, mainly videos. An investigation into various court proceedings revealed that on many occasions Meta failed to flag suspicious messages and activities, whereas other chat services did.
Meta has often been accused of harbouring paedophiles and not being strict enough on child abusers, and CSAM content. Image Credit: AFP
Meta is falling short of its own standards when it comes to curbing the dissemination of child sexual abuse material (CSAM). A recent investigative report by The Guardian has revealed that paedophiles and child abusers often use Meta’s Facebook Messenger and Meta Pay to buy and sell videos and images of children being abused.
The investigation began when authorities in the US state of Pennsylvania arrested one Jennifer Louise Whelan in November 2022 on multiple charges, including sex trafficking and indecent assault involving three young children, some of whom were as young as six. Whelan would create photos and videos of herself abusing the children and then sell the content to paedophiles via Facebook Messenger.
Authorities came to know of Whelan after one Brandon Warren was indicted in February 2022, accused of distributing explicit material involving minors. Warren, like Whelan, pleaded not guilty to the charges.
Both Whelan and Warren were using Meta’s Facebook Messenger to send and receive the content. They also used Meta Pay for financial transactions in exchange for the abuse material.
Meta Pay, formerly Facebook Pay, is a simple peer-to-peer payment service integrated with Meta’s social networks.
Court documents reveal that Meta failed to flag Whelan and Warren’s activities. Instead, it was Kik Messenger, a different messaging app, that first reported Warren’s suspicious uploads to authorities, triggering a police investigation in West Virginia.
Subsequent findings led to the discovery of videos and images allegedly purchased from Whelan via Facebook Messenger.
Former Meta content moderators claim they observed suspicious transactions related to child sex trafficking via Meta Pay but had no avenue to report them to compliance teams. They also note how easily Meta Pay can be used within Messenger, facilitating potentially illicit transactions. Despite this, Meta’s systems reportedly do not flag such transactions, especially those involving relatively small amounts of money.
Meta Pay, as a money services business, is subject to US anti-money laundering regulations. Failure to detect and report illicit transactions could constitute violations of these laws.
Experts highlight the need for better detection mechanisms, especially given the visibility social media platforms have into users’ activities.
The siloed nature of Meta’s operations further complicates the situation. Former moderators highlight their inability to communicate internally about suspicious transactions they encounter.
As scrutiny intensifies, questions arise about Meta’s effectiveness in combating illicit activities facilitated through its platforms. The implications extend beyond regulatory compliance, touching on broader issues of child safety and corporate responsibility.
It seems corporations have only one responsibility, to their shareholders. The days of good corporate citizenship have long passed by.