3/21/2024

Telegram porn channel

Fake AI images sexualizing Taylor Swift spread to X, formerly known as Twitter, from a Telegram group dedicated to sharing "abusive images of women," 404 Media reported. These images began circulating online this week, quickly sparking mass outrage that may finally force a mainstream reckoning with the harms caused by the spread of non-consensual deepfake pornography.

While it's still unknown how many AI tools were used to generate the flood of harmful images, 404 Media confirmed that some members of the Telegram group used Microsoft's free text-to-image AI generator, Designer. According to 404 Media, the images were not created by training an AI model on Taylor Swift's images but by hacking tools like Designer to override safeguards designed to stop tools from generating images of celebrities.

Members of the group shared strategies for subverting these safeguards by avoiding prompts using "Taylor Swift" and instead using keywords like "Taylor 'singer' Swift." They were then able to generate sexualized images by using keywords describing "objects, colors, and compositions that clearly look like sexual acts," rather than attempting to use sexual terms, 404 Media reported.

404 Media and Ars were not able to replicate outputs based on recommendations in the Telegram group. It's possible that Microsoft has already updated the tool to stop users from abusing Designer.

At least one member of the Telegram group claimed to be the source of some of the Swift images, posting in the channel that they didn't know if they "should feel flattered or upset that some of these Twitter stolen pics are my gen."