TikTok launches new age restrictions to protect… themselves
Last week TikTok rolled out new child safety restrictions that seemed like a good idea on the surface: restricting gift sending to users over 18, and removing direct messaging for everyone under 16. This comes on top of their YouTube series about using the app safely and their work towards better safety policies at their new Dublin hub.[1] However, do not be fooled.
“we know that groomers use direct messaging to cast the net widely and contact large numbers of children.”
— Andy Burrows, NSPCC
Ofcom reported that 13% of TikTok’s 800 million monthly active users are aged 12–15,[2] a substantial share to be banning from a key feature of the app. Whilst the main focus of the app has always been short videos, direct messaging between accounts that follow each other allows for even more social connection among users. That same feature adds to the risk that underage children may be pressured into creating indecent content, or even groomed by manipulative users. Andy Burrows, head of child safety online policy at the NSPCC, highlighted this risk, telling the BBC that “we know that groomers use direct messaging to cast the net widely and contact large numbers of children.”[3]
While DragonflAI certainly supports efforts to protect children, we were skeptical about the efficacy of this age restriction, because no verification is required to create an account. This leads us to believe TikTok is following in the footsteps of many other social media sites, creating seemingly robust user safety protocols that do little more than keep regulatory bodies off their backs.
Creating User “I.Am.Under.16”
We decided to test the age input and see what we needed to do to access TikTok and view content inappropriate for children. So the DragonflAI TikTok account was born… in 1952.
Aptly enough, we were excited to find that the username ‘I.am.under.16’ was in fact available.
Going through the sign-up process, we found no real verification of our fake age other than sliding a puzzle piece into a slot. We didn’t even need to verify the email address before we could start scrolling through content, so a real email doesn’t appear to be necessary.
#Quarantitties
While much of the content we saw was not expressly explicit, as you can see above, a great deal of it was highly sexualised. After scrolling through the app for only a couple of minutes, we came across a hashtag challenge called ‘quarantitties’, which involves girls flashing the public from afar. Certainly not appropriate for anyone under 16.
The lack of age verification is surprising given that the whole point of these new restrictions is to stop everyone under 16 from using direct messaging. It suggests TikTok is not really concerned with the possibility of kids creating accounts with fake ages and messaging strangers who may be looking to take advantage of them.
At DragonflAI we fully support companies introducing child protection measures for their apps and social platforms, but TikTok’s new policies seem to have missed the mark, appearing more self-protecting than child-protecting.
We hope that companies like this soon realize that the desire to protect themselves from backlash over underage children creating and sharing indecent content is no substitute for meaningful action that actually helps children stay safe and enjoy these apps responsibly. Whilst we admire the creation of the Dublin hub and the removal of direct messaging for under-16s, we hope this begins the longer process of legitimate child protection.
With our software, TikTok could prevent nudity from being uploaded at source by moderating content on the device, rather than waiting for it to reach the platform, and we welcome that discussion to help keep children safe.
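On-device moderation of this kind amounts to running a local check before any upload request leaves the phone. The sketch below is purely illustrative: the classifier is a stub and all function names are our own invention, not DragonflAI’s or TikTok’s actual API.

```python
# Hypothetical sketch of on-device upload gating. In a real deployment,
# nudity_score() would call a trained mobile-optimised model; here it is
# a stub that flags a marker byte string, purely for illustration.

def nudity_score(image_bytes: bytes) -> float:
    """Stub standing in for an on-device classifier's P(nudity)."""
    return 1.0 if b"NUDITY" in image_bytes else 0.0

def can_upload(image_bytes: bytes, threshold: float = 0.5) -> bool:
    """Run moderation locally, before any network request is made."""
    return nudity_score(image_bytes) < threshold

def upload(image_bytes: bytes) -> str:
    """Only contact the server when the local check passes."""
    if not can_upload(image_bytes):
        return "blocked on device"  # content never leaves the phone
    return "sent to platform"
```

The key design point is where the check runs: because `can_upload` executes before `upload` makes any network call, flagged content is stopped at source rather than after it has already reached the platform.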
[1] https://newsroom.tiktok.com/en-gb/why-we-re-making-changes-to-direct-messaging
[2] https://www.ofcom.org.uk/__data/assets/pdf_file/0023/190616/children-media-use-attitudes-2019-report.pdf