It's been obvious for many years now that any new app, platform, or program that comes out is mastered by perverts before anyone else, and consequently, they are there waiting for our children to arrive. South Korean police have just confirmed this.
Some 60% of deepfake sex crime victims in South Korea are minors, police say
SEOUL, Aug. 30 (UPI) -- As South Korea faces a sprawling deepfake sex crime scandal, police data released Friday revealed that nearly six out of 10 victims were minors.
Local media reports detailing the spread of fake, AI-generated pornographic images of ordinary women and girls, hosted in chat rooms on the Telegram messaging app, have sent shockwaves across the country in recent days.
Many of the chat rooms -- one of which was found to have over 220,000 members by South Korean newspaper Hankyoreh -- are based at universities and high schools, and target students and teachers.
The crime has turned out to be alarmingly widespread. A survey of students, teachers and staff at K-12 schools by the Korean Teachers and Education Workers Union released Thursday found that more than 20% of respondents said that they were either direct or indirect victims of deepfake pornography.
The number of reported deepfake cases has soared, from 156 in 2021 to 297 as of July this year, according to the National Police Agency, which announced an intensive effort to track and arrest deepfake creators and distributors on Tuesday.
According to data submitted Friday by the police agency to Rep. Yang Boo-nam of the Democratic Party of Korea, 315 out of the 527 victims -- or 59.8% -- in reported cases between 2021 and 2023 were teenagers.
South Korean President Yoon Suk Yeol called on authorities this week to step up efforts to combat the crimes.
"Deepfake videos may be dismissed as mere pranks, but they are clearly criminal acts that exploit technology under the shield of anonymity," Yoon said during a Cabinet meeting Tuesday. "Anyone can be a victim of such digital sex crimes."
The scandal has emerged as Telegram's billionaire founder and CEO Pavel Durov was arrested and indicted this week by French authorities for allegedly permitting criminal activity on the app, including drug trafficking, child sexual content and fraud.
The Korea Communications Standards Commission, the country's media regulator, announced this week that it plans to set up a round-the-clock hotline with Telegram to monitor and delete deepfake videos and has asked French authorities for regular cooperation.
Women's rights groups, however, have criticized the lack of government response to digital sex crimes that have been rampant for years in South Korea.
In 2019, another Telegram-based digital sex scandal drew widespread outrage. The so-called "Nth room" case involved criminals blackmailing girls into sharing sex videos, with many victims being underage.
The outcry over that case followed mass protests in 2018 against a spycam epidemic in South Korea, in which secret videos taken from public restrooms, changing rooms, subways and buses were widely sold and shared online. And 2019's Burning Sun scandal, in which K-pop stars and other powerful figures were implicated in charges of rape and hidden sex videos, further inflamed public sentiment.
"In a society that does not properly punish or prevent crimes and violence against women, we are literally living without a state, without a sense of safety in our daily lives," Korean women's rights group Womenlink said in a statement Monday.
"With the advancement of technology, the patterns of crime are diversifying," the statement said. "Can a society survive in which the daily safety of millions of its members is threatened? This is a state of national emergency."
Heather Barr, associate director of the Women's Rights Division at Human Rights Watch, said that digital sex crimes in South Korea are causing lingering trauma for victims, even driving some to suicide.
TikTok faces lawsuit over ‘blackout challenge’ death of 10-year-old girl
By Maryclaire Dale
A U.S. appeals court revived on Tuesday a lawsuit filed by the mother of a 10-year-old Pennsylvania girl who died attempting a viral challenge she allegedly saw on TikTok that dared people to choke themselves until they lost consciousness.
While federal law generally protects online publishers from liability for content posted by others, the court said TikTok could potentially be found liable for promoting the content or using an algorithm to steer it to children.
“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” Judge Patty Shwartz of the 3rd U.S. Circuit Court of Appeals in Philadelphia wrote in the opinion issued Tuesday.
Lawyers for TikTok’s parent company, ByteDance, did not immediately return phone and email messages seeking comment.
Lawyers for the mother, Tawainna Anderson, had argued that the so-called “blackout challenge,” which was popular in 2021, appeared on Nylah Anderson’s “For You” feed after TikTok determined that she might watch it — even after other children had died trying it.
Nylah Anderson’s mother found her unresponsive in the closet of their home in Chester, near Philadelphia, and tried to resuscitate her. The girl, described by her family as a fun-loving “butterfly,” died five days later.
“I cannot stop replaying that day in my head,” her mother said at a news conference in 2022, when she filed the lawsuit. “It is time that these dangerous challenges come to an end so that other families don’t experience the heartbreak we live every day.”
A district judge initially dismissed the lawsuit, citing Section 230 of the 1996 Communications Decency Act, which is often used to protect internet companies from liability for things posted on their sites.
The three-judge appeals court panel partially reversed that decision Tuesday, sending the case back to the lower court for trial.
“Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her. But TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her ‘For You Page,’” Judge Paul Matey wrote in a partial concurrence to the opinion.
Jeffrey Goodman, a lawyer for the family, said it’s “inevitable” that courts give Section 230 more scrutiny as technology reaches into all facets of our lives. He said the family hopes the ruling will help protect others, even if it doesn’t bring Nylah Anderson back.
“Today’s opinion is the clearest statement to date that Section 230 does not provide this catchall protection that the social media companies have been claiming it does,” Goodman said.