GitHub’s Deepfake Porn Crackdown Still Isn’t Working

“When we look at intimate image abuse, the vast majority of tools and weaponized use have come from the open source space,” says Ajder. But they often start with well-meaning developers, he says. “Someone creates something they think is interesting or cool and someone with bad intentions recognizes its malicious potential and weaponizes it.”

Some tools, like the repository disabled in August, have purpose-built communities around them for explicit uses. The model was positioned as a tool for deepfake porn, Ajder says, becoming a “funnel” for abuse that predominantly targets women.

Other videos uploaded to the porn-streaming site by an account crediting AI models downloaded from GitHub featured the faces of popular deepfake targets, including the celebrities Emma Watson, Taylor Swift, and Anya Taylor-Joy, as well as other less famous but very much real women, superimposed into sexual situations.

The creators freely described the tools they used, including two that GitHub has scrubbed but whose code survives in other repositories.

Perpetrators on the prowl for deepfakes congregate in many places online, including in covert community forums on Discord and in plain sight on Reddit, complicating deepfake prevention efforts. One Redditor offered their services using the archived repository’s software on September 29. “Could someone do my cousin?” another asked.

Torrents of the main repository banned by GitHub in August are also available in other corners of the web, showing how difficult it is to police open-source deepfake software across the board. Other deepfake porn tools, such as the app DeepNude, have been similarly taken down before new versions popped up.

“There’s so many models, so many different forks in the models, so many different versions, it can be difficult to track down all of them,” says Elizabeth Seger, director of digital policy at cross-party UK think tank Demos. “Once a model is made open source [and] publicly available for download, there’s no way to do a public rollback of that,” she adds.

One deepfake porn creator with 13 manipulated explicit videos of female celebrities credited a prominent GitHub repository marketed as an “NSFW” version of another project, one that encourages responsible use and explicitly asks users not to use it for nudity. “Learning all available Face Swap AI from GitHUB, not using online services,” their profile on the tube site brazenly states.

GitHub had already disabled this NSFW version when WIRED identified the deepfake videos. But other repositories branded as “unlocked” versions of the model were available on the platform on January 10, including one with 2,500 “stars.”

“It is technically true that once [a model is] out there it can’t be reversed. But we can still make it harder for people to access,” says Seger.

If left unchecked, she adds, the harm caused by deepfake “porn” is not just psychological. Its knock-on effects include the intimidation and manipulation of women, minorities, and politicians, as has been seen with political deepfakes targeting female politicians around the world.

But it’s not too late to get the problem under control, and platforms like GitHub have options, says Seger, including intervening at the point of upload. “If you put a model on GitHub and GitHub said no, and all hosting platforms said no, for a normal person it becomes harder to get that model.”

Reining in deepfake porn made with open source models also relies on policymakers, tech companies, developers, and, of course, the creators of abusive content themselves.

At least 30 US states also have some legislation addressing deepfake porn, including bans, according to nonprofit Public Citizen’s legislation tracker, though definitions and policies are disparate and some laws cover only minors. Deepfake creators in the UK will also soon feel the force of the law: on January 7, the government announced that it will criminalize the creation of sexually explicit deepfakes, as well as the sharing of them.
