5 Reasons social networks should be using client-side moderation
Client-side, or device-level, moderation is the future of moderation thanks to its cost-effectiveness, rapid processing, and end-to-end safety, and it's not some far-off ideal. On-device moderation is here already, and here are five reasons you should make the switch.
1. Instantly Block or Approve Content
Delays in content being posted or shared on a social platform often come down to the time it takes to upload content to a server, have it checked, and send an approval or rejection back. This delay is also highly variable depending on server load, and can be especially long when humans are moderating the content.
When users have to wait minutes (if not hours, in some cases) to share a picture, they may look elsewhere to share their content instantly. Moderating on-device removes the upload delay entirely, and computation time can be under 40ms. Quick moderation means happy users who share more and enjoy an app to its fullest.
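The flow above can be sketched as a local gate that runs before any upload. This is a minimal illustration, not DragonflAI's actual API: `classify_image` stands in for a bundled on-device model (a real app would call a mobile inference runtime), and the function names are hypothetical.

```python
import time

def classify_image(image_bytes: bytes) -> str:
    """Hypothetical on-device classifier stub.
    A real app would run a bundled ML model here."""
    return "safe"  # stub decision for illustration

def moderate_before_upload(image_bytes: bytes):
    """Gate the post locally: the verdict is produced on the
    user's device, with no server round-trip and no queue."""
    start = time.monotonic()
    verdict = classify_image(image_bytes)
    elapsed_ms = (time.monotonic() - start) * 1000
    approved = verdict == "safe"
    return approved, elapsed_ms

# Only content that passes the local check is ever uploaded.
approved, elapsed_ms = moderate_before_upload(b"\x89PNG fake image bytes")
```

The key design point is the ordering: classification happens before upload, so latency is bounded by local compute rather than network conditions or server load.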
2. User Privacy is Important
Moderation involving a third party (human moderators, sending content to servers, etc.) inherently compromises user privacy by introducing an additional person or system into the loop.
While this may matter less for social media feeds, where much of the content is shared publicly, it is hard to support private messaging while still moderating the content that passes through your platform.
Because on-device moderation is self-contained and relies on no external parties, it is compatible with end-to-end encryption (E2EE), letting platforms prevent users from sending illegal or obscene images both publicly and privately.
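Why this works with E2EE comes down to ordering: the check runs on the sender's device before the content is encrypted, so the platform never needs to see plaintext. The sketch below is a hypothetical illustration, with a stub classifier and a toy XOR "cipher" standing in for a real E2EE layer.

```python
def classify_image(image_bytes: bytes) -> bool:
    """Hypothetical on-device check; True means the image is allowed."""
    return True  # stub decision for illustration

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy XOR placeholder for the app's real E2EE layer.
    NOT real cryptography; for illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_private_image(image_bytes: bytes, key: bytes):
    # Moderate BEFORE encrypting: the check happens locally,
    # so end-to-end encryption stays intact and the server
    # only ever receives ciphertext.
    if not classify_image(image_bytes):
        return None  # blocked on-device; nothing leaves the phone
    return encrypt(image_bytes, key)
```

Server-side moderation cannot take this position in the pipeline without breaking E2EE, which is exactly the trade-off on-device moderation avoids.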
3. Keep Moderation Costs Low Without Sacrificing Safety
By harnessing the power of users' devices to run image analysis, the computation comes at no cost to the platform, eliminating server storage costs, compute costs, and per-image moderation fees.
There really isn't much more to say about the cost of on-device moderation versus typical server-side or human moderation services: it is simply much cheaper, and just as effective.
Maintaining a safe platform affordably is easier than ever. We appreciate that cost is often a reason platforms skimp on moderation, but keeping users safe is more important, and now there's no excuse.
4. Reduced Liability from Server Hosted Content
Any time content is uploaded to a server for moderation, there is a risk that illegal content such as CSAM will be saved to the company's servers, creating a liability nightmare and potentially requiring a complete shutdown of the hosting server until it can be effectively 'cleaned.'
Because on-device moderation needs no servers, content classified as indecent never leaves the device, eliminating this liability risk entirely.
5. Mental Health of Moderators
Human moderators are subjected to some of the internet's most heinous content, some of which goes beyond indecent into outright illegal and abusive material. It is a gruelling job. Limiting human review to appeals, where users believe their content should be allowed on a platform, dramatically reduces moderators' exposure and the risk of PTSD from moderation work.
The recent example of Facebook moderators suffering mental and emotional trauma attributed to daily exposure to images and videos of graphic pornography, abuse, murder, and other forms of hate and violence should be proof enough of the value of on-device moderation.
While there are many more reasons on-device moderation will replace server-side and primarily human moderation, we think these five are more than enough to make the switch.
DragonflAI has built our entire company around keeping people safe, from users to moderators, with a simple, fully offline process for moderating content. If you want to reduce your liability and moderation costs, join our beta now.