Thursday, July 7, 2022

DeepNude code nuked as Discord bans netizens who sought out the vile AI application • The Register


GitHub has deleted a repository containing partial blueprints of DeepNude, the notorious AI-powered application that removed women's clothing from photos to create fake nudes.

These fragments of decompiled Python source code revealed the software's inner workings, and were generated from a copy of the DeepNude application that was briefly sold and distributed a couple of weeks ago. The material was placed in the now-removed GitHub repository, apparently to encourage others to build new variants of DeepNude, using the application's neural-network models as well as the decompiled algorithms as a guide.

And it was not the only repository pulled from GitHub for containing DeepNude-based source code.

These details come after Discord, a popular multi-user chat system used by gamers and geeks, banned several server instances that hosted or shared the DeepNude software. Discord also appears to have banned users who searched for the application on its chat servers or who helped distribute it.

A Discord account used by The Register – for honest, journalistic reasons – to investigate the spread of cracked versions of the paid software was terminated on Monday night this week.

"Discord is focused on maintaining a safe and secure environment for our community, and the Discord community has flagged your account for violations of our Terms of Service and Community Guidelines," read the email from Discord informing us that our account was toast.

"Our team reviewed the report and took action by deactivating your account. Your account either directly sent non-consensual pornography, or was involved in servers dedicated to this content."

The Reg has asked Discord for more details about this crackdown. The chat system business earlier told us it was investigating terms-of-service violations reported against a server or a user.

"Non-consensual pornography warrants an instant server shutdown and user ban whenever we identify it," it warned. The DeepNude-generated images of women without their clothes on, digitally imagined from their clothed selfies, are a form of non-consensual pornography.


The outrage over DeepNude began last month when it was discovered that a group of shameless developers had built a desktop application for Windows and Linux, plus an Android variant, that let anyone use the tool to see what a woman might look like undressed. Free and paid versions were available for download.

No deep knowledge was required to use the application: just feed it a photo of a clothed woman, and it would automatically spit out the same image with the clothing replaced by what might lie beneath, or thereabouts. It sometimes worked well, depending on the lighting and framing, having been trained on thousands of pornographic pictures of women in various states of undress.

When this monstrous mischief went viral and the internet exploded with outrage, its creators quickly pulled the application from their website, asked people to stop using and copying it, and tried to pretend it never happened.


DeepNude deep-nuked: AI photo app stripped clothes from women to render them naked. Now, it's stripped from the web


But it was too late: the premium build of the software was cracked, and it was resold or distributed via file-sharing networks and Discord servers. Computer nerds also set about reverse engineering the code and, as we said, published the algorithms and decompiled source in places like GitHub for other perverts to use.

Eventually, someone reported the repositories to GitHub as inappropriate, and the site swiftly removed them for being "sexually obscene." A GitHub spokesperson told The Register in an email last night: "We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our terms of service and community guidelines."

Microsoft's GitHub acceptable use policies state: "Under no circumstances will users upload, post, host, or transmit any content to any repositories that is or contains sexually obscene content."

The code-hosting biz says its staff do not manually trawl projects hosted on GitHub for DeepNude implementations, relying instead on netizens to report any unpleasantness they spot.

"We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy," said a GitHub spokesperson.

The Register has, at one point or another, spotted several DeepNude-related repositories on GitHub, so it may be difficult to remove them all, especially if they keep popping up.

"We don't disclose the number of reports per project," the spokesperson added. ®
