As the tech and government worlds try to respond, the Taylor Swift universe is seething over deepfake images of the pop singer that emerged on social media last week.
By NBC’s count, images of Swift in graphic poses were viewed more than 27 million times after first being posted to X.
“Whether or not legal action will be taken is being decided but there is one thing that is clear: these fake AI-generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge,” the Daily Mail quoted a source close to Swift as saying. Swift has been in the spotlight recently for dating Kansas City Chiefs tight end Travis Kelce.
“It is shocking that the social media platform even let them be up to begin with. These images must be removed from everywhere they exist and should not be promoted by anyone,” the source said.
“Taylor’s circle of family and friends are furious, as are her fans obviously. They have the right to be, and every woman should be. The door needs to be shut on this. Legislation needs to be passed to prevent this and laws must be enacted,” the source said.
those taylor ai pics is so disgusting and whoever started this should be in jail! that’s literally sexual abuse, protect taylor swift
— segryl ✰ (@ryldiaries) January 26, 2024
Microsoft CEO Satya Nadella said the tech community must respond, according to NBC.
“Yes, we have to act,” he said.
“I think we all benefit when the online world is a safe world. And so I don’t think anyone would want an online world that is completely not safe for both content creators and content consumers. So therefore, I think it behooves us to move fast on this,” Nadella said.
As of Saturday, X had adjusted its search function so that searches for “Taylor Swift” came up empty, although other combinations of her name still returned normal results, according to CNN.
“This is a temporary action and done with an abundance of caution as we prioritize safety on this issue,” a representative of X said.
What happened to Taylor Swift is appalling. I’m glad more people are now aware of the ways in which AI can be used to create horrific forms of image-based sexual abuse. It’s been going on for years – I’ve been researching and writing about it since 2018. Welcome to the nightmare. https://t.co/3Ct0V0MIjd
— Kelsey Farish (@KelseyFarish) January 28, 2024
Angry Swifties calling for action got the attention of the Biden White House.
“We are alarmed by the reports of the…circulation of images that you just laid out – of false images to be more exact, and it is alarming,” White House press secretary Karine Jean-Pierre said, according to ABC.
“While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people,” she said.
Democratic Rep. Joe Morelle of New York is hoping the episode gives life to his bill to make sharing digitally altered explicit images without a person’s consent a federal crime.
“We’re certainly hopeful the Taylor Swift news will help spark momentum and grow support for our bill, which as you know, would address her exact situation with both criminal and civil penalties,” a representative for Morelle said.
This article appeared originally on The Western Journal.