Every day, thousands of children are sexually abused. You can stop the abuse of at least one child simply by praying. You could stop the abuse of thousands of children by forwarding the link in First Time Visitor? by email, Twitter or Facebook to every Christian you know. Save a child, or lots of children! Do something, please!

3:15 PM prayer in brief:
Pray for God to stop 1 child from being molested today.
Pray for God to stop 1 child molestation happening now.
Pray for God to rescue 1 child from sexual slavery.
Pray for God to save 1 girl from genital circumcision.
Pray for God to stop 1 girl from becoming a child-bride.
If you have the faith, pray for 100 children rather than one.
Give Thanks. There is more to this prayer here

Please note: All my writings and comments appear in bold italics in this colour

Thursday, 23 April 2026

Wolves Among the Sheep > Christian singer - 160 charges of CSA/CSAM; Fla. Pastor gets 12 Life Sentences for CSA/CSAM

 

Christian singer charged with child sex abuse




A Tennessee musician has been charged with 160 counts related to alleged child sex abuse, including rape, according to court documents.

Phillip Vaught was arrested over the weekend following a lengthy investigation by Hendersonville police.

A Sumner County indictment lists 160 counts against Vaught, which include sexual abuse of a child, rape and exploitation.

The charges, according to local news outlet WECT, include five counts of sexual abuse of a child, as well as aggravated rape and soliciting sexual exploitation. A further 76 counts allege that Vaught unlawfully and knowingly used a minor in the production of material depicting the minor engaging in sexual activity. Several other counts relate to possession of child pornography, amongst other charges.

Court records indicate the alleged crimes date back to 2015 and continued as recently as January 2026.

Vaught, a Nashville-based country rock singer, has been described as a crossover artist with Christian and gospel roots. His father is a Baptist minister.

He is currently being held on a $1.5 million bond.





Former SBC Pastor Receives 3 Consecutive Life Sentences Following Conviction for Child Sex Abuse Material


Jonathan Elwing, a former Southern Baptist pastor, has been convicted of multiple charges involving child sex abuse material (CSAM). He has been given 12 life sentences in total.

Editor’s note: This article refers to disturbing reports of child sex abuse, which some readers might find triggering.

Elwing was serving as pastor of Palm View Baptist Church in Palmetto, Florida, at the time of his arrest in June 2024. The church is affiliated with the Southern Baptist Convention (SBC).

Elwing resigned from his role as pastor shortly before being taken into custody. 

Elwing was arrested after an investigation revealed that he used cryptocurrency to purchase CSAM. Law enforcement executed a search warrant at the church and Elwing’s home and found CSAM on Elwing’s phone. 


At the time, Larry Bianchi, the chair of Palm View’s deacon board, denounced Elwing’s crimes and emphasized that “the people of the church are the church.”

“The pastor may be the front man, he may be the leader of the church—and we need a new one,” Bianchi said. “But Palm View Baptist Church will go on because of the strength of the congregation.”

“Personally, I keep thinking it is a really bad dream, and I am going to wake up from it, but unfortunately, this happens in society,” said Bianchi. “It happens more often than not in places where children can be seen. There’s a lot of children in church.”

RELATED: Trial Date Set for Former SBC Pastor Charged With Producing and Possessing CSAM

Prosecutors later added 14 additional charges to Elwing’s case. These charges included six more counts of possession of CSAM, six counts of use of a child in a sexual performance, and two counts of sexual battery on a person less than 12 years old—which is a capital offense.


Sexual Abuse Awareness Month > The Consequences of Child Sexual Violence > INSPQ

 

April is Sexual Abuse Awareness Month!

Child sexual abuse is far and away the worst kind of sexual abuse, yet it remains poorly known and understood. CSA is the worst atrocity mankind has ever inflicted upon itself, both in the number of its victims and in the devastation it causes in their lives.

More children have been sexually abused in the 21st century than all the casualties (dead and wounded) in all the wars and all the genocides of the entire 20th century.

Here is a comprehensive list of the consequences of CSA, many of which I can personally vouch for.





Consequences of childhood and adolescent sexual violence


INSPQ  -   National Institute of Public Health of Quebec


Sexual violence in childhood and adolescence affects the well-being and development of victims, and contributes significantly to the overall burden of disease [1]. Some consequences may be immediate, while others may appear or persist into adulthood.

In Canada, in 2021, sexual violence before the age of 15, involving physical contact, contributed to 0.03% of deaths and 0.29% of years lived with disability (YLDs). Worldwide, in 2021, it ranked 48th among causes of death and 30th among risk factors having the greatest impact on years lived with disability [2].

Sexual violence during childhood and adolescence can also occur at the same time as other negative experiences, such as other forms of violence experienced within the family or at school (e.g., exposure to domestic violence, physical abuse) or family difficulties (e.g., a parent's substance abuse problems). This combination of experiences can exacerbate the consequences [3–5].

Consequences for mental health

  • Anxiety [4,6–13]
  • Behavioural disorders, impulsivity, and risk-taking behaviour [7,10,13,25]
  • Body image dissatisfaction [16]
  • Depression [4,6–11,13–15]
  • Eating disorders, borderline or antisocial personality disorders, schizophrenia [7,9,11–13,24]
  • Emotional, affective and psychological distress and difficulties (e.g., distress, despair, anger, attachment difficulties, identity disturbances) [4,12,14–17]
  • Feelings of shame and guilt [7,16]
  • Lower self-esteem [19]
  • Post-traumatic stress (e.g., dissociation, flashbacks, avoidance of risky situations, intrusive thoughts, nightmares, hypervigilance, irritability) [4,7–12,14,16,23]
  • Problem gambling [21,22]
  • Self-harm [4,12,13]
  • Suicidal ideation and behaviour [4,7,10,12–14,20]
  • Use of psychoactive substances, including abuse of or dependence on these substances (e.g., alcohol, cannabis, tobacco, cocaine, opioids, other drugs) [4,7,10–14,18]

Consequences for physical and sexual health 

  • Altered knowledge, perceptions and beliefs about sexuality (e.g., judgmental attitudes towards sexuality, disgust towards sexuality) [16,26]
  • Diseases (asthma, cancer, heart disease, etc.) [3,13,33]
  • Early or unwanted pregnancies (including in adolescence) and reproductive and obstetrical problems (e.g., miscarriage, distress during labour and delivery) [7,13,16,27]
  • Risky or compulsive sexual behaviours (e.g., sexual activities initiated earlier, multiple sexual partners, sex without a condom, injection drug use during sex, hypersexuality, inability to control sexual urges and behaviours) [4,12,16,26,27]
  • Sexual dysfunctions and difficulties (e.g., sexual dissatisfaction, difficulty achieving orgasm, problems with sexual desire or arousal, genital or sexual pain, erectile dysfunction, sexual distress, sexual aversion) [16,28–31]
  • Sexually transmitted and blood-borne infections [4,7,13,32]
  • Somatic problems (e.g., urinary incontinence during sleep in children, digestive problems, headaches or stomach-aches, chronic fatigue, irritable bowel syndrome, migraines, weight loss, insomnia) [6,7,15]

Relational, social, and economic consequences

  • Difficulties at school (e.g., dropping out of school, poorer academic performance) and at work (e.g., job insecurity, financial instability) [4,5,34,35]
  • Externalizing behaviour problems (e.g., aggressive behaviour, physical fighting, school suspension, delinquency) [4,6,7]
  • Loss of family ties and difficulty maintaining family ties [9]
  • Lower levels of education and income [35]
  • Relationship and parental difficulties (e.g., relationship or parental dissatisfaction, relationship conflicts, difficulty forming and maintaining intimate relationships) [7,10,16,36,37]
  • Social isolation [6]

The consequences listed above are presented in alphabetical order, regardless of frequency or severity.

The numbered references are available from INSPQ.

==================================================================================

CSAM websites double in one year; EU's 'Battle' against CSAM; Meta's response to $375m fine; Welsh rape victims successfully sue police

 

Criminal gangs profiting as child sexual abuse websites double, experts say


Analyst who worked on Internet Watch Foundation report says content exists ‘across all social media platforms’ and is ‘very easy’ to find

The number of commercial child sexual abuse websites has doubled in a year as experts say that criminal gangs are making “huge profits” from online sexual exploitation.

According to data collected by the Internet Watch Foundation (IWF), 15,031 commercial child sexual abuse sites were found in 2025, compared with 7,028 found in 2024, a 114% increase.
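The IWF's reported increase checks out arithmetically. A quick sketch, using only the figures quoted above:

```python
# Commercial child sexual abuse sites found by the IWF (figures quoted above)
sites_2024 = 7_028
sites_2025 = 15_031

# Year-over-year percentage increase
pct_increase = (sites_2025 - sites_2024) / sites_2024 * 100
print(f"{pct_increase:.0f}% increase")  # 114% increase
```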

An analyst who worked on the report but did not wish to be named said that this content exists “across all social media platforms” and is “very easy” to find.

“I can find child sexual abuse content, the worst categories, category A content, which is penetration of children as young as babies on any social media platform in as little as one search term and two clicks,” said the analyst.

“I think the public have this perception that this sort of material is hidden away in dark and dirty corners of the internet, but it’s not, it’s in plain sight.”

Kerry Smith, the chief executive of the IWF, said: “It is clear criminals are exploiting systemic failures and are finding it far too easy to reap huge profits from children’s sexual exploitation.

“We need mandatory measures on financial services to proactively detect, take down and report digital payment links for the sale of images and videos of child sexual abuse.

“We also need to see companies which use end-to-end encryption on their services adopt the tried and trusted safety tools which can prevent criminals using these platforms as safe havens to distribute child sexual abuse material,” Smith said.

The report found that the share of child sexual abuse sites where users were directly paying for content increased from 2% in 2024 to 5% in 2025. The analyst said that the cost could start from $12 (£8.90) and reach up to $120 for the most extreme content.

Of these commercial sites, 16% were disguised so that illegal content could be accessed through a pathway that shows as legal content when loaded directly on to a browser. The most common payment method was cryptocurrency, while money transfer services and card payments were also used.

The analyst said that the money made from illegal content operated “like a pyramid scheme” through affiliate links.

“The video channel is profiting because of the traffic that’s going through. And then the person that’s posted the video will be profiting through all the clicks and the advertising through the affiliate schemes,” they continued.

Researchers also found instances of perpetrators trying to determine victims’ locations so they could be exposed to other criminal users.

The number of reports from young people under the age of 18 who have been the victim of sextortion – when a criminal threatens to publish nude or sexual imagery of a victim unless they comply with their demands – increased by 127% in 2025 compared with 2024. According to data collected from the Report Remove helpline, a free confidential service run by the IWF and the NSPCC, children as young as seven years old have self-reported sextortion.

Chris Sherwood, the CEO at the NSPCC, said: “The growing number of commercial child sexual abuse sites uncovered by the Internet Watch Foundation lays bare a severe problem, with malicious criminal gangs profiting off children’s pain.

“We know young victims of sexual exploitation are often left defenceless and can face re-traumatisation knowing images of themselves continue to circulate online. This form of abuse demands urgent action.

“Ofcom must use its powers and work with others to spot and disrupt these perpetrators at the source, before they impact more young lives. Equally, tech companies need to utilise existing technology that prevents children from taking, sharing, or receiving nude images.

“Childline’s Report Remove service is here for any young person under 18 who wants to speak to a professional and confidentially report sexual images and videos of themselves. Through the service, the IWF and Childline can help get these images removed and prevent them from being shared in the future,” Sherwood said.

 The NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331.

===================================================================================


What to know about the EU’s CSAM battle

As digital rules on detecting child abuse material expire and lawmakers clash over new powers, tech companies keep monitoring anyway, raising legal questions about how far the European Union will go.

By Peder Schaefer

Peder Schaefer is a Brussels-based journalist.

23 Apr 2026

Privacy advocates argue that scanning private messages would amount to mass surveillance, undermining digital rights. Child safety groups, and many tech companies, say that without such tools, illicit content will spread rampantly.

That tension came to a head at the end of March, when the European Parliament voted against extending a temporary exemption that had allowed platforms to scan communications. 

Since the exemption expired on April 3, tech companies have been left in legal limbo: they’re obligated to remove illegal content, but the scanning methods they rely on may now violate EU privacy rules. Several companies have said they will continue scanning regardless, even without clear legal standing.

Now, those companies are urging the EU to move quickly on a permanent child safety law — one that would introduce centralized reporting and clearer obligations, and finally resolve years of legal uncertainty. 

What counts as CSAM, and how do companies remove it?

Child sexual abuse material is any content that shows a child being sexually abused or exploited, according to RAINN, a sexual abuse advocacy group based in Washington, D.C. CSAM is widely prevalent: over 20.4 million cases were reported in 2024 to the U.S. National Center for Missing and Exploited Children, an international clearinghouse for flagging CSAM. In 2024, 70% of all such content globally originated in Europe, according to the Internet Watch Foundation.

Companies typically scan their platforms using tools such as PhotoDNA, a form of hash matching that flags known CSAM by comparing digital fingerprints, without requiring a human to view the image.
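The matching step itself is simple to sketch. In the illustration below, the blocklist digest and function names are hypothetical, and a plain SHA-256 stands in for PhotoDNA's proprietary robust hash (which, unlike SHA-256, survives resizing and re-encoding); only the lookup logic is the point:

```python
import hashlib

# Hypothetical blocklist of digests of known illegal images, of the kind
# distributed to platforms by clearinghouses such as NCMEC.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def digital_fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of the image. Real systems use a robust
    perceptual hash; SHA-256 is used here only to show the mechanism."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    # Matching is a set lookup against known fingerprints:
    # no human ever needs to view the image.
    return digital_fingerprint(image_bytes) in KNOWN_HASHES
```

The design matters for privacy debates: the platform never interprets the image, it only checks whether its fingerprint appears on a list of previously identified material.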

How has the EU tried to regulate CSAM scanning? 

In 2020, the Commission introduced a temporary measure allowing companies to scan for CSAM without breaching data privacy rules, but efforts to extend it failed in Parliament in March this year, with lawmakers rejecting a longer and broader exemption backed by the Commission and Council. 

In 2022, the EU executive put forward a permanent framework to clarify what scanning is allowed and to establish a dedicated centre to combat child sexual abuse. But the plan has stalled, as EU institutions remain divided over how far those scanning powers should extend. 

How did opposition to “chat control” gain momentum? 

In 2020, digital privacy groups such as Stop Chat Control and Fight Chat Control began campaigning against scanning, arguing that exemptions to privacy rules amounted to mass surveillance. Civil society groups organized citizens to send thousands of emails to European lawmakers.

“We allowed providers to check all personal communications of all users at any time,” said MEP Birgit Sippel (S&D, DE), the rapporteur on the temporary exemption. “As a general matter, that is deeply affecting the rights of all the users, and most of them are not criminals, clearly.”

Basic user privacy is incompatible with broad scanning mandates, according to Patrick Grady, an EU policy manager at the Chamber of Progress. He said the original temporary exemption, which had more narrow mandates, was the right approach. 

Where does the CSAM debate stand now?

When the exemption expired, tech companies accused the Parliament of failing to protect children. Yet firms including Snapchat, Meta, Google and Microsoft have continued to scan regardless, according to a joint letter.

Lawmakers say that’s illegal. “I’m frustrated by companies that say we decide which laws we respect, and which laws we want to ignore,” Sippel said.

Those same companies are now advocating for the EU institutions to fast-track the permanent regulation instead of reviving the rejected extension, according to Chloe Setter, Google’s head of child safety policy in Europe. Final trilogue talks are expected at the end of June, according to the Parliament’s press office.




Meta Verdict Poised to Shift Platform Liability and Sexual Abuse Litigation

 Thursday, April 23, 2026

On March 24, 2026, a New Mexico jury delivered a verdict that may well define a new era of sexual abuse litigation. After six weeks of testimony, the jury found Meta Platforms, Inc. liable for failing to protect children from sexual predators and misleading users about the safety of its platforms, ordering the company to pay $375 million in civil penalties. The verdict has been called historic, marking the first time a state has successfully sued Meta over child safety issues.

This article examines the legal theories and broader implications that make this ruling a potential watershed moment for clients who have been harmed by social media platforms, and the legal landscape surrounding platform liability more generally.

Consumer Protection, Not Publisher Liability

One of the most significant features of the New Mexico case is the legal framework under which it was brought. Rather than attempting to hold Meta liable as a publisher of user-generated content, which would implicate the broad immunity of Section 230 of the Communications Decency Act, the State of New Mexico framed its claims under the New Mexico Unfair Practices Act (UPA). The complaint explicitly stated that it did not seek to hold Meta liable as a publisher, but rather for Meta’s “deceptive, unfair, unconscionable, unreasonable, and unlawful conduct in designing and maintaining its products” and for “making deceptive statements concerning Meta’s conduct, platforms and policies.”

The UPA prohibits unfair or deceptive trade practices and unconscionable trade practices in the conduct of any trade or commerce. The State advanced four counts against Meta: (1) unfair or deceptive trade practices, (2) unfair trade practices, (3) unconscionable trade practices, and (4) public nuisance. Each violation of the UPA carried a maximum civil penalty of $5,000, and the jury found thousands of individual violations, resulting in the $375 million total penalty.
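The jury did not itemize per-violation amounts, but the statutory cap puts a floor on how many violations the total implies. A quick check, assuming each violation was assessed at the $5,000 maximum (smaller assessments would imply even more violations):

```python
MAX_PENALTY_PER_VIOLATION = 5_000    # UPA statutory maximum, in dollars
TOTAL_PENALTY = 375_000_000          # jury's total civil penalty

# Minimum number of violations consistent with the total if each was
# assessed at the statutory maximum.
implied_min_violations = TOTAL_PENALTY // MAX_PENALTY_PER_VIOLATION
print(implied_min_violations)  # 75000
```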

This approach effectively sidesteps the Section 230 defense that has long shielded technology companies from liability. By recasting platform misconduct as a consumer protection issue—rooted in misrepresentations about safety, the knowing deployment of addictive design features, and the concealment of internal research—this case offers a replicable playbook for attorneys general and private plaintiffs alike.

The Role of Algorithms: From Passive Hosting to Active Harm

Central to the jury’s findings was the role of Meta’s recommendation algorithms. New Mexico argued that Meta’s platforms did not merely host harmful content but actively “steered” young users toward sexually explicit material, child sexual abuse material (CSAM), and even sex trafficking through its recommendation systems. As the complaint alleged, Meta’s algorithms operated to “search and disseminate” sexually exploitative materials and to create social networks connecting users looking to buy and sell such content and the children victimized by it.

By establishing that the design and operation of algorithms can, itself, constitute an unfair or deceptive trade practice, this verdict reframes the debate around platform liability. Companies can no longer argue that they are mere conduits for user content when their own systems are actively curating and amplifying harmful material to vulnerable users.

Corporate Knowledge and Misrepresentation: A Key Liability Driver

The evidence at trial painted a stark picture of a company that publicly proclaimed its platforms were safe while internally documenting the opposite. The complaint cited years of internal documents and testimony demonstrating that Meta knew about the harms its platforms caused and chose not to act.

Court documents unsealed during the case included an internal email warning that there could be as many as 500,000 cases of online sexual exploitation per day on Facebook and Instagram. Yet Meta’s public-facing “prevalence” metrics consistently reported low percentages of offensive content, metrics that the company’s own internal study contradicted, showing users were “100 times more likely to tell Instagram they’d witnessed bullying in the last week than Meta’s bullying prevalence statistics indicated.”

This gap between public representation and internal knowledge, what New Mexico characterized as a pattern of misrepresentation, omission, and active concealment, formed the backbone of the consumer protection claims.

A New Era of Tech Accountability: Private Actions

The New Mexico verdict, combined with a contemporaneous Los Angeles jury verdict, signals what advocates are calling a new era for technology accountability. These verdicts may finally provide a path forward for individual victims who have long been stymied by Section 230 immunity.

A critical question is whether individual victims, not just state attorneys general, have a private right of action under consumer protection frameworks. Most state consumer protection statutes, including the New Mexico UPA, provide both public enforcement mechanisms (through the attorney general) and private rights of action for aggrieved consumers. Under NMSA 1978, § 57-12-10, any person who suffers actual damages as a result of conduct prohibited by the UPA may bring a civil action for treble damages or $100, whichever is greater, plus attorney fees and costs.
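The private-recovery floor described above can be sketched as a simple formula (following the statute as characterized in this article; the function name is illustrative and this is not legal advice):

```python
def upa_private_recovery(actual_damages: float) -> float:
    """Private recovery under NMSA 1978, § 57-12-10 as described above:
    treble damages or $100, whichever is greater. Attorney fees and
    costs are awarded in addition and are not modeled here."""
    return max(3 * actual_damages, 100.0)
```

So a plaintiff with $20 in actual damages would still recover the $100 floor, while $10,000 in actual damages would treble to $30,000.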

This suggests that children and families who have suffered harm from Meta’s platforms may be able to pursue their own claims using the same consumer protection framework that succeeded in the New Mexico state action. The New Mexico complaint itself noted that claims were brought under the UPA, which prohibits “unfair or deceptive trade practices and unconscionable trade practices in the conduct of any trade or commerce.” The jury’s findings that Meta’s conduct constituted deceptive, unfair, and unconscionable trade practices could provide significant support for private plaintiffs seeking to establish similar claims, where the facts show that such prohibited practices proximately caused the abuse of their child.

The UPA framework is particularly well-suited for private actions because it captures the full range of Meta’s alleged misconduct. The statute reaches conduct that “takes advantage of the lack of knowledge, ability, experience or capacity of a person to a grossly unfair degree”—language directly applicable to vulnerable minor users. Private plaintiffs can point to Meta’s own internal knowledge that its platforms took advantage of children’s developmental vulnerabilities and their “inability to self-regulate” as evidence satisfying this standard.

The Power of Per-Violation Penalties

The $375 million penalty, though substantial, was far below the $2 billion that New Mexico attorneys had originally sought. The total was reached because the jury found thousands of individual violations of the UPA, each carrying a maximum penalty of $5,000. The complaint also sought disgorgement of profits, injunctive relief, attorney fees, and pre- and post-judgment interest.

The per-violation penalty structure is particularly significant when projected onto future cases. Given the scale of Meta’s user base, with an average of 3.14 billion daily active users as of September 2023, the potential exposure in litigation involving millions of affected users is staggering. Any state with a comparable consumer protection statute could theoretically pursue similar claims, and the financial stakes could dwarf the New Mexico verdict.

The Broader Litigation Landscape

The New Mexico verdict does not exist in isolation. Meta was found liable in a separate case in Los Angeles, where a young woman claimed she became addicted to platforms like Instagram and YouTube as a child because of how they were intentionally designed. That case focused on addiction-based harm rather than sexual exploitation, but it implicated the same corporate knowledge and design-choice theories.

Beyond sexual exploitation, the New Mexico case included a public nuisance claim alleging that Meta’s platforms contributed to increases in youth suicide, depression, eating disorders, bullying, and social media addiction. The Surgeon General of the United States issued an advisory in May 2023 warning that adolescents who spent more than three hours per day on social media faced double the risk of experiencing poor mental health outcomes.

A successful public nuisance theory could empower state governments to seek abatement orders requiring platforms to fundamentally redesign how they serve minor users, a far more consequential remedy than monetary damages alone. The complaint sought injunctive relief, abatement of the public nuisance, and payment of monies to the State to abate the nuisance, in addition to the UPA penalties.

Conclusion

The New Mexico verdict against Meta represents a potential turning point in the law of platform liability and sexual abuse litigation by demonstrating that technology companies can be held accountable for the foreseeable harms of their design choices. This case lays the groundwork for a new era of litigation on behalf of children and families harmed by social media platforms. With thousands of similar cases pending nationwide and Meta’s appeal still to come, this is an area of the law that will demand close attention from practitioners and clients alike in the months and years ahead.
