Meta delays E2EE rollout
After planning to introduce end-to-end encryption (E2EE) for messaging across Facebook and Instagram, their parent company, Meta, has announced that the rollout will be delayed until 2023.
The intended change would allow the sender and receiver to view a message, but not law enforcement agencies or even Meta itself. This level of confidentiality has raised concerns among politicians and child safety groups, who believe it would prevent police from investigating online child abuse material. WhatsApp, which is also owned by Meta, already uses E2EE, unlike Facebook and Instagram.
E2EE works by encrypting data as it travels between devices, so that the only way to read a message is to gain access to the device it was sent from or received on. The technology has grown in popularity for messaging services. However, for platforms like Facebook and Instagram, whose messaging services are seen as an add-on to their main purpose of photo and video sharing, it can be assumed that encrypting these apps' messaging features has not been a priority for Mark Zuckerberg.
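The core idea can be sketched in a few lines of Python. This is a toy illustration only, not real cryptography and not any particular platform's protocol: the two endpoints derive a shared key via Diffie-Hellman so that a server relaying the ciphertext never holds the key, and a hash-derived XOR keystream stands in for a proper cipher. The prime, generator and keystream construction are all illustrative assumptions.

```python
import hashlib
import secrets

# Toy parameters (illustrative only; real E2EE uses vetted primitives).
P = 2**127 - 1  # a Mersenne prime, fine for a demonstration
G = 3

def keypair():
    # Each endpoint keeps priv secret and shares only pub.
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def shared_key(my_priv, their_pub):
    # (G^a)^b == (G^b)^a mod P, so both endpoints derive the same key.
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_cipher(key, data):
    # Toy stream cipher: XOR with a SHA-256-derived keystream.
    # Encryption and decryption are the same operation.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Sender and receiver generate keypairs and exchange only public keys.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

key_a = shared_key(a_priv, b_pub)  # sender's copy of the shared key
key_b = shared_key(b_priv, a_pub)  # receiver derives the identical key

ciphertext = xor_cipher(key_a, b"hello")  # all the relaying server sees
plaintext = xor_cipher(key_b, ciphertext)  # only the endpoints can recover this
```

The point of the sketch is that the relaying server only ever sees public keys and ciphertext; without one of the private keys, which never leave the devices, it cannot recover the message. This is why only physical access to an endpoint device defeats the scheme.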
The NSPCC (National Society for the Prevention of Cruelty to Children) has highlighted that encrypted messaging services are those most used by people sharing child sexual abuse material (CSAM), and UK Home Secretary Priti Patel has also criticised the technology, stating that it could cause serious problems for law enforcement investigating criminal activity, including online child abuse. The NSPCC asked 46 UK police forces about the platforms used to commit child sexual offences in 2020. The responses revealed that 9,470 instances of child sexual abuse were reported to the police, with 52% of these taking place on Facebook (now Meta) owned applications. Over a third took place on Instagram and 13% on Facebook and Messenger, but very few occurred on WhatsApp, probably because it does use E2EE. It is important to remember that the 9,470 reported incidents are not a true reflection of the actual amount of CSAM in circulation; the real number is almost unimaginably larger.
However, these figures have sparked concern that encrypting Facebook's Messenger and Instagram's direct messages could enable those sharing this type of material to go undetected even more than they already do, increasing the amount of CSAM readily available and being published online. For E2EE to be effective for all, a balance must be struck between safety and privacy. The argument for E2EE is that it protects users' privacy by preventing governments and potential hackers from getting hold of personal information and conversations. This raises the age-old question: if you have nothing to hide, why aren't you sharing it?
Meta’s Global Head of Safety, Antigone Davis, stated that the delay in implementing encryption until 2023 was simply down to the organisation wanting to take its time to make sure it got everything right, despite previously indicating that the change would occur in 2022 at the latest.
“As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.” – Antigone Davis
Davis made this statement virtually during a US Senate hearing on child online safety, where she also set out additional preventative measures that Meta has already put in place. First, its “proactive detection technology” scans for suspicious patterns of activity, for example a user who repeatedly sets up new profiles or messages a large number of users they are not following or friends with. Secondly, all accounts held by under-18s now default to private or ‘friends only’, preventing adults from messaging them unless they are already connected. Lastly, the company claims to already be educating children with in-app tips on how to avoid unwanted messages from people they do not know. The steps already taken indicate that a level of responsibility is being taken towards the safety of its users.
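The first measure Davis describes is a metadata heuristic, and its shape can be sketched briefly. This is an illustrative toy, not Meta's actual system: the function name, data layout and threshold are all assumptions, and the only idea taken from the article is "flag accounts that message many users they are not connected to".

```python
from collections import Counter

def flag_suspicious(messages, connections, threshold=10):
    """Flag senders who message many users they are not connected to.

    messages: iterable of (sender, recipient) pairs.
    connections: dict mapping each user to the set of users they
        follow or are friends with.
    threshold: illustrative cut-off for how many unconnected
        recipients counts as suspicious.
    """
    unconnected = Counter()
    for sender, recipient in messages:
        if recipient not in connections.get(sender, set()):
            unconnected[sender] += 1
    return {user for user, count in unconnected.items() if count >= threshold}

# Toy data: one account blasting strangers, one messaging a friend.
messages = [("bulk_sender", f"user{i}") for i in range(12)] + [("friendly", "pal")]
connections = {"friendly": {"pal"}}
flagged = flag_suspicious(messages, connections)
```

Notably, a heuristic of this kind inspects only metadata (who messaged whom), not message content, which is why pattern-based detection can in principle keep operating even after messages themselves are end-to-end encrypted.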
The NSPCC’s Head of Child Safety commented on the delay, welcoming it and arguing that Meta should only proceed with implementing E2EE if it can show that the technology will not put children at greater risk of abuse. This came after the NSPCC led a global coalition with 130 other child protection organisations, discussing the dangers of end-to-end encrypting social media messaging services.
We understand the hesitation around E2EE where the sharing of CSAM is concerned, but it is important to note that there are also benefits to the likes of Meta implementing this type of technology. If Meta strikes the right balance between user privacy and safety, there is hope that illegal content sharing will not increase. At DragonflAI, we recognise the volume of CSAM that passes through messaging services that use E2EE. That is why we have made sure that our technology works offline, on-device and with E2EE, so that more of the CSAM currently online can be reported, and more can be stopped from being uploaded or shared through social media, particularly through the likes of WhatsApp, which uses this type of encryption because it believes it is best for its users.