Lockdown Nudity Detection Race — The Results!
After a couple of suspenseful weeks of waiting to see who would win the Lockdown Nudity Detection Race, expecting notifications that we had violated rules and checking whether our content was still visible, the results are in. So, without further ado, let’s have a look at which companies you thought would do a good job of stopping this content from being published, and then see whether the ‘horses’ you backed in our poll ended up doing well.
With 30% of total votes, Facebook was the public front runner for removing or stopping the nude image from being uploaded, followed closely by Instagram and Lego Life, both with 23%. The remaining contenders were all fairly close, hovering between 2% and 7%, with the only exceptions being Snapchat and Tumblr, which were unanimously voted the worst: not a single vote went to either. With the favourites and dark horses locked in, we looked at how the companies performed over the past two weeks, hoping it would be easy to rank all ten. It was not.
Given the variety of results, how similar some of the top performers were, and the fact that a few of these sites didn’t remove the content at all, ranking our contenders was difficult. We ended up with a few ties, and a few results we deemed ‘crashes’, since those sites could hardly be considered contenders at all.
1st Place — Yubo, Lego Life
Both of these sites did an excellent job of not allowing the content to be uploaded at all, which is especially important for Lego Life, a kid-focused platform. Kudos to our joint winners, and to everyone in the poll who thought Lego Life would do well.
2nd Place — Tumblr
We were just as surprised as you might be. Given the amount of negative press Tumblr has received in the past for indecent and illegal content hosted on the platform, its 0% in the poll wasn’t a shock. But Tumblr’s total ban on pornographic content seems to have been upheld, and we thought it deserved second place given that the image was never made public, although it remains on the site, hidden on our account.
3rd Place — Instagram
Instagram did well by stopping our image from being shared, but we did manage to get it onto our story, where it could be viewed for the story’s duration. Not fantastic, but the failure of many other platforms here helped Instagram earn a podium spot.
There was certainly a drop-off after the top three places in how well the remaining companies handled nude content being uploaded, which was surprising for some, and certainly disappointing from our perspective.
4th Place — TikTok
After our investigation into TikTok, which led to this whole race idea, we were certainly curious to see how well it would do, and after about 24 hours we knew. While 24 hours may not seem like a long time, considering the number of users on the app, child abuse imagery could be published and seen by a huge number of people, both deliberately and by accident, so we don’t feel TikTok really deserves much praise for this.
5th Place — Snapchat
While we would deem this a failure, since the content was uploaded and stayed on the platform until it automatically expired after 24 hours, we had to rank Snapchat above the remaining sites because there was a time limit and the image needed to be re-uploaded to stay visible.
Last Place — Facebook, Twitter
While both these sites identified our image as indecent or nude content, it is still on both platforms and can be viewed by anyone; we class that as a loss. So for the 30% of responses in favour of Facebook in our poll, don’t expect a happy call from the bookies. Two of the largest and most popular social media sites, with immense resources to work on this problem, didn’t manage to stop nude content from being public on their platforms, even though they seem able to identify it.
DNF — Amino, Pinterest
Neither of these sites blocked the upload, removed the image, or even identified it as nude content. Having ‘crashed’ out of the race, they were given a Did Not Finish result for their total lack of moderation.
Thank you all for participating in DragonflAI’s first annual Lockdown Nudity Detection Race. Stay tuned for our next competition, open to the public, where we will be asking you to send us (non-nude) images of objects you think could be confused for more private human anatomy, so we can see how well our algorithm classifies them!
If you want a slightly more detailed look at how each individual site handled the content we tried to publish, read on below.
TikTok
We uploaded the image in video form with no issue initially, but after roughly 24 hours the video was taken down for community guideline violations, and our account, i.am.under.16, was blocked from posting for one week after the notification.
Snapchat
We filmed the image for ten seconds and added this to our public story, where it stayed for the entire 24-hour life of the Snapchat story. We repeated this process three additional times with no change to the result.
Instagram
When we tried to upload the image to our Instagram profile, it was immediately removed from being shared, but we did manage to keep it on our story, similar to Snapchat.
Facebook
Once uploaded to Facebook, the image was flagged as ‘possibly’ inappropriate and in violation of community guidelines; however, we were given the option to click ‘ignore’ and complete the upload, where it remains publicly visible.
Twitter
The image was posted to Twitter briefly, then flagged as sensitive material. After being flagged, the content was hidden from the main feed, but we were able to view it from another device, without even needing to sign in, by simply clicking that we wanted to view the sensitive material.
Pinterest
The image was added to a board on the app with no issues, and remains uploaded and public.
Lego Life
When we tried to share the image it was immediately sent to moderation, and was never published.
Amino
We set the image to be both our profile image and background image with no issue, and the content is still visible to the public.
Yubo
Immediately after trying to upload the image, it was tagged as inappropriate and couldn’t be uploaded.
Tumblr
Our content was tagged as inappropriate before the image could be published, and it was moved to hidden pending an appeal. The appeal did not go through, so the image was never made public, but it can still be seen privately on our account.