Creating fake nudes: the damage caused by new ‘nudifying’ tech

Hannah Mercer
4 min read · Aug 11, 2021

With the unveiling of the new nudifying tool created by Deepsukebe, a website that received over five million hits in June 2021 alone, another weapon has emerged for creating fake nude images, particularly of women, using artificial intelligence. Images like these are highly likely to spread rapidly across social media.

The website, which launched in 2020 and has no known founder, claims it can “reveal the truth hidden under clothes”. Celebrities, including an Olympic athlete, have come forward to say they have been victims of this technology, with some people believing the fabricated images are real. This can have a devastating effect not only on people’s careers but also on their mental health. Uploading nude images without a person’s permission, or as a form of revenge porn, is wrong, and we strongly believe that more action needs to be taken to ensure that victims of this type of abuse are given the justice and protection they deserve. Fabricating a nude image of someone from a clothed photograph seems a clear violation of their basic rights, especially given how convincing the results can be.

Although many of those who purchased the $99 premium version of the Deepsukebe app were refunded after the backlash, many similar apps remain on the market, since the underlying technology was made publicly available by its original developers. Most of these reproductions are of poor quality, but Deepsukebe’s website uses a proprietary algorithm that produces more convincing results than its competitors’.

Deepsukebe’s Twitter page states its mission to “make all men’s dreams come true”, and a blog post even says the team is working on a more powerful and accurate version of the app for its users. It is no surprise that this app has caused outrage. MPs have called for a ban on the tool and for legislation to ensure that nothing like it is developed in the future. MP Maria Miller has argued that the government must change the law so that creating nude images of someone without their consent is recognised as an offence, enabling those affected to pursue action to protect themselves.

If developers make this type of technology available, they should be held accountable for it. The campaign group Cease (Centre to End All Sexual Exploitation) is pushing for apps like Deepsukebe, and the technology they have produced, to be classed as online sexual abuse and therefore covered by the Online Safety Bill. Miller has campaigned alongside Cease for the non-consensual sharing of nude and sexually explicit content, now commonly known as revenge porn, to be addressed in legislation, and there is hope that it will be included in the forthcoming Online Safety Bill.

Deepsukebe, along with the other companies implementing the technology it created, ultimately has to be held accountable, because these tools add to the volume of abusive content available online. This technology can turn people who have never sent a nude image into victims of revenge porn, which highlights just how dangerous it is.

There have been many cases of celebrities whose cloud storage was hacked and whose private nude images went viral. The sharing of this content can ruin careers, relationships and, ultimately, lives. It is frightening that nudity-creation tools are now publicly accessible: the illegal step of hacking into image storage is no longer even needed. Deepsukebe removes not only clothes but also dignity and humanity.

With revenge porn, one recurring problem, beyond the shortage of legislation, is the blame placed on victims. ‘You sent nudes, so what do you expect?’ is a common refrain. In reality, most people who shared nude images did so in what they believed was confidence, never intending them to be made public. That is what makes this tool, and this type of technology, even worse: the AI means that people who have never sent a nude image can be targeted and made victims, and there is no guarantee they will even be believed. If situations escalate, or victims are not trusted by those they have relationships with, this could lead to cases of domestic abuse as well as sexual abuse.

One of our central missions at DragonflAI is to protect children online. While we have not discussed the potential for AI-generated child abuse imagery through these deepfake tools here, it is a very real and terrifying prospect that needs to be mentioned, and it will be covered in more detail in a later post. For now, it is fair to say that most people see this technology as exploitative and harmful; the men (and women) who thank Deepsukebe for ‘making their dreams come true’ are blinded by the enjoyment they take from it, and there must be accountability for the harm it can cause everyone.


Hannah Mercer

Founder of DragonflAI — On-Device Nudity Moderation. My mission is to protect children by reducing the volume of child abuse online. www.dragonflai.co