Mom Demands AI Regulation After Daughter Was Victim Of Deep Fake Porn

A concerned mother whose 14-year-old daughter was a target of fake nude photos, just like Taylor Swift, is teaming up with lawmakers from across the political spectrum to demand Congress enact AI regulation. 

'It is clear that AI technology is advancing faster than the necessary guardrails,' said New Jersey Republican Rep. Tom Kean. 

'Whether the victim is Taylor Swift or any young person across our country - we need to establish safeguards to combat this alarming trend,' he said as he called for Congress to pass his AI regulation bill. 

In November, male students at Westfield High School in Kean's New Jersey district were caught with explicit AI-generated images of their female classmates. 

According to Francesca Mani, 14, who was a victim of the false images, and her mother Dorota, the boys involved were back at school after a two-day suspension. 

They filed a complaint with the police - but without laws on the books to regulate AI-generated art, the complaint was quickly closed. 

'There was very little to no accountability,' Dorota Mani told DailyMail.com. 'There's no AI legislation - the school did not update its AI cyber harassment policies, they were just like a broken record, 'counseling is available.' That's helping nothing.' 

A slew of viral fake images purporting to be nude photos of Taylor Swift has prompted fresh calls from Congress to enact AI regulation

Francesca Mani, 14, (right) with her mom Dorota (left). Mani, who attends Westfield High School, said the presence of the boy who made fake explicit images of her makes her 'very uncomfortable and scared' after she found out fake nude images of herself had been disseminated to students in her grade on Snapchat 

'If that's not negligence, I really don't know what it is,' she said. The mother-daughter duo have become a public face of AI regulation advocacy and will head to the White House in February to plead their case. 

'Everybody is asking how we are feeling. Nobody's asking, how are the boys feeling with such little accountability?' the mother went on. 

'You know, how are the platforms that are connecting those transactions, like AMEX and PayPal and Chase and Visa, are they sitting up at night knowing that they're allowing for those transactions? Nobody's asking how are Google and Amazon and Microsoft sleeping at night and how are they feeling knowing that they are hosting the content?'

Meanwhile, Taylor Swift is the latest target of the website Celeb Jihad, which flouts state porn laws and continues to outrun cybercrime squads.

Celeb Jihad is believed to be the origin of dozens of recent graphic images which depict Swift at a Kansas City Chiefs game. The images, which were created using AI software, were shared by the website on January 15 under the headline: 'Taylor Swift Chiefs Sex Scandal Caught On Camera'.

Swift has been a regular at Chiefs games since going public with her romance with star player Travis Kelce.

Nonconsensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii and Georgia. In Illinois and California, victims can sue the creators of the pornography in court for defamation. But it is not banned under federal law. 

'Intimate deepfake images like those targeting Taylor Swift are disturbing, and sadly, they're becoming more and more pervasive across the internet. I'm appalled this type of sexual exploitation isn't a federal crime,' Rep. Joe Morelle, D-N.Y., said in a statement to DailyMail.com. 

More than 90% of such false imagery, known as 'deepfakes', is porn, according to image-detection firm Sensity AI. 

A bipartisan pair of lawmakers is pushing their bill to curb 'deepfake' AI-generated images after fake explicit images of Taylor Swift went viral 

Earlier this week, a wave of fake robocalls mimicking President Joe Biden advised New Hampshire primary voters not to vote, telling them it is 'important that you save your vote for the November election.' 

But the Westfield High incident prompted area lawmakers to demand a ban on the creation of such images. 

'Try to imagine the horror of receiving intimate images looking exactly like you—or your daughter, or your wife, or your sister—and you can't prove it's not,' Morelle said. 'Deepfake pornography is sexual exploitation, it's abusive, and I'm astounded it is not already a federal crime.'

Kean and Morelle have launched a bipartisan bill that would require AI generators to 'prominently display disclosure to clearly identify content' that is generated by AI and would establish working groups to find best practices to identify and disclose AI-generated content. 

The Swift scandal is the latest of many involving Celeb Jihad, which was created by its anonymous founder in 2008.

Astonishingly, the site claims its content is 'satire' and states it is 'not a pornographic website'.

Swift's legal team previously issued a warning to Celeb Jihad in 2011 after it published a faked photo which depicted the singer topless. The picture appeared with the caption 'Taylor Swift Topless Private Pic Leaked?'

At the time, her lawyers threatened to file a trademark-infringement suit and accused it of spreading 'false pornographic images' and 'false news'. The website appears to have published hundreds, and potentially thousands, more faked images of Swift since it was started.

Celeb Jihad was also involved in several massive leaks of private photos hacked from celebrities' phones and iCloud accounts in 2017.

The website was one of several that published illicitly obtained photos of celebrities including Miley Cyrus, Tiger Woods and Woods's former girlfriend Lindsey Vonn. Vonn said the leak was an 'outrageous and despicable invasion of privacy.'
