
White House Wants To Criminalize Sharing Fake Nude Taylor Swift And Similar Images


At least the White House is finally getting their priorities in order.

You know, something seems off about this.

Constant degenerate content is allowed and promoted by liberals and Democrats.

So what gives?

This sure sounds like a setup to me.


You see, earlier this week a crude AI-generated image of Taylor Swift went viral on the social media platform X.

Which isn’t new; people have been making fake AI versions of everyone since the technology came out.

Only this time, White House press secretary Karine Jean-Pierre chimed in on it.

That’s odd.

What do they care about this topic?

Shouldn’t they be concerned about the border, inflation and a hundred other things?

Turns out she said on Friday that Congress should pass a bill criminalizing the sharing of similar images.

Similar images, eh?

Hmm. Seems like a bold suggestion.

Are they trying to get ahead of something being released?

Incriminating images, perhaps?

Are they using uber-famous Taylor Swift to get the masses behind their new bill? (Which I bet is already written up.)

Once passed, they can have the press say certain images are AI and fake.

Then they can arrest anyone who posts them, even if they’re real.

Curious that SAG-AFTRA also chimed in, but first here’s Karine Jean-Pierre:

The Washington Examiner reports:

White House press secretary Karine Jean-Pierre told reporters Friday it is alarming that fake, sexually explicit images of Taylor Swift spread across social media earlier this week and that Congress should pass a bill criminalizing the sharing of similar images.

Explicit images of the pop star, believed to be generated by artificial intelligence, spread like wildfire across X and other social media sites. One image drew more than 47 million views before the account that posted it was suspended, and the incident reignited calls for new legislation on the subject.

“We are alarmed by reports of the circulation of images that you just laid out — false images, to be more exact,” Jean-Pierre answered when asked if the president would support such legislation. “Social media companies make their own independent decisions about content management, but we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and nonconsensual, intimate imagery of real people. Sadly, though, too often, we know that lack of enforcement disproportionately impacts women, and they also impact girls.”

Yeah.

We totally believe you have women’s and girls’ best interests at heart.

It’s a good thing they also want to criminalize sharing actual crude images of women on websites that children can access, like X.

Oh, wait. They don’t.

Rolling Stone added:

SAG-AFTRA deplored the AI-generated graphic images of Taylor Swift that went viral on X (formerly Twitter) this week, calling the content “upsetting, harmful, and deeply concerning” in a statement issued on Friday.

“The development and dissemination of fake images – especially those of a lewd nature – without someone’s consent must be made illegal,” the union said, while also voicing support for Congressman Joe Morelle’s Preventing Deepfakes of Intimate Images Act to combat the practice. “As a society, we have it in our power to control these technologies, but we must act now before it is too late. We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”

Obviously we are against this sort of crude content being shared.

It’s just that I don’t buy the reasoning behind what they’re saying.

As the phrase goes, “I’m not buying your soap, lady!”

Here’s the full press briefing:
