Lewd Taylor Swift AI Images Likely Originated In A Telegram Chat Group


Sexually explicit images of Taylor Swift generated by artificial intelligence emerged from a Telegram group before they were shared millions of times around the world, analysts believe.

Swift, 34, was said to be deeply distressed by the images, and members of Congress have renewed their calls for the criminalization of sharing of pornographic, nonconsensual deepfakes.

The images were first spotted on Wednesday and spread rapidly, receiving 45 million views and 24,000 reposts on X before they were removed 19 hours later, The Verge reported.

On Thursday, tech website 404 Media discovered that the images originated in a Telegram group dedicated to making non-consensual, AI-generated sexual images of women.

AI-generated explicit images of Taylor Swift were posted on the Celeb Jihad website, which was previously warned by the singer's lawyers after it shared another faked image in 2011. Pictured: Swift performs during the Eras Tour in Sao Paulo, Brazil, on November 24, 2023

The obscene images are themed around Swift's fandom of the Kansas City Chiefs, which began after she started dating star player Travis Kelce

The images were generated by members of a Telegram chat group, using Microsoft programs and sharing workarounds to skirt Microsoft's rules 

Members of the group were annoyed at the attention the Swift images were drawing to their work, 404 Media reported.

'I don't know if I should feel flattered or upset that some of these twitter stolen pics are my gen,' one user in the Telegram group said, according to the site.

Another complained: 'Which one of you mfs is grabbing s*** here and throwing it on Twitter?'

A third replied: 'Well if there was any way to get this s*** shut down and raided it's idiots like that.'

The images were not classic 'deepfakes' with Swift's face superimposed onto someone else's body, 404 Media reported.

Instead, they were entirely created by AI - with members of the group recommending Microsoft's AI image generator, Designer.

Microsoft does not permit users to generate an image of a real person by entering a name such as 'Taylor Swift' as a prompt.

But users of the Telegram group would suggest workarounds, prompting Designer with phrases such as 'Taylor "singer" Swift' instead.

Swift pictured leaving Nobu restaurant after dining with Brittany Mahomes, wife of Kansas City Chiefs quarterback Patrick Mahomes, on January 23

And instead of instructing the program to create sexual poses, the users would input prompts with enough objects, colors and compositions to create the desired effect.

The Pennsylvania-born billionaire was among the first people to fall victim to deepfake porn, in 2017.

The news site reported she was also one of the first targets of DeepNude, which generated naked images from a single photo; the app has since been taken down.

Swift's unpleasant situation has renewed a push among politicians for tighter laws.

Joe Morelle, a Democratic member of the House representing New York, introduced a bill in May 2023 that would criminalize nonconsensual sexually explicit deepfakes at the federal level.

On Thursday, he wrote on X: 'Yet another example of the destruction deepfakes cause.'

One website called 'DeepNude' boasts that you can: 'See any girl clothless [sic] with the click of a button.' A 'standard' package on 'DeepNude' allows you to generate 100 images per month for $29.95, while $99 will get you a 'premium' of 420.

His fellow New Yorker, Representative Yvette Clarke, also called for action.

'What's happened to Taylor Swift is nothing new. For yrs, women have been targets of deepfakes w/o their consent. And w/ advancements in AI, creating deepfakes is easier & cheaper,' she said.

'This is an issue both sides of the aisle & even Swifties should be able to come together to solve.'

Swift has not commented on the incident, and neither has X owner Elon Musk.

As of April 2023, the platform's manipulated media policy prohibits media 'depicting a real person [that has] been fabricated or simulated, especially through use of artificial intelligence algorithms.'

Images have frequently slipped through the net, however, and the situation worsened when Musk took over in October 2022 and gutted the moderation team.

