TechieTricks.com
Explicit AI deepfakes of Taylor Swift cause outrage


A series of explicit AI deepfake images of Taylor Swift doing the rounds on social media has caused outrage amongst fans and lawmakers alike, as reported by VentureBeat.

The images show Time's 2023 Person of the Year engaging in explicit sexual activity with fans of the NFL's Kansas City Chiefs, the team her boyfriend Travis Kelce plays for.

An army of Swift fans has rushed to her defense on social media with the hashtag #ProtectTaylorSwift, while X battled to block the content due to new accounts regularly reposting the images. Meanwhile, US lawmakers are now under renewed pressure to crack down on the rapidly evolving generative AI marketplace.

It’s not clear which AI image-generation tools were used to create these specific deepfakes of Taylor Swift. Many services, including Midjourney and OpenAI’s DALL-E 3, prohibit the creation of sexually explicit or suggestive content.

However, 404 Media, which says it tracked the images down to a group on Telegram, claims they were created using Microsoft’s AI tools, which are powered by DALL-E 3.

X account @Zvbear has admitted to creating some of the images, according to Newsweek, and has since set the account to private.

What can lawmakers do to crack down on deepfake content creation?

With the Daily Mail reporting Taylor Swift’s fury over these specific images being spread on social media, US lawmakers are under pressure to regulate the technology behind them.

Tom Kean Jr., a Republican Congressman from New Jersey, released a statement to the press this week urging Congress to take up and pass two bills he has introduced to help regulate AI.

In the statement, Kean says: “Whether the victim is Taylor Swift or any young person across our country, we need to establish safeguards to combat this alarming trend.

“My bill, the AI Labeling Act, would be a very significant step forward.”

The AI Labeling Act would require companies behind AI multimedia generators to add a “clear and conspicuous notice” to their generated works stating that they are “AI-generated content.” It’s unclear, though, how that would prevent the creation of such images in the first place.

Meta is already doing something similar for images generated using its Imagine AI art generator tool, while OpenAI recently promised to implement AI image credentials.

Featured Image: Photo by Rosa Rafael on Unsplash

James Jones

Freelance Journalist

James Jones is a highly experienced journalist, podcaster and digital publishing specialist, who has been creating content in a variety of forms for online publications in the sports and tech industry for over 10 years.

He has worked at some of the leading online publishers in the country, most recently as the Content Lead for Snack Media’s expansive portfolio of websites, including FootballFanCast.com, FootballLeagueWorld.co.uk and GiveMeSport.com. James has also appeared on several national and global media outlets, including BBC News, talkSPORT, LBC Radio, 5 Live Radio, TNT Sports, GB News and BBC’s Match of the Day 2.

James has a degree in Journalism and previously held the position of Editor-in-Chief at FootballFanCast.com. Now, he co-hosts the popular We Are West Ham Podcast, writes a weekly column for BBC Sport and covers the latest news in the industry for ReadWrite.com.




