In the latest iteration of the never-ending (and always head-scratching) crypto wars, Graeme Biggar, the director general of the UK’s National Crime Agency (NCA), has called on Instagram-owner Meta to rethink its continued rollout of end-to-end encryption (E2EE), with web users’ privacy and security pulled into the frame yet again.
The call follows a joint declaration by European police chiefs, including the UK’s own, published Sunday, expressing “concern” at how E2EE is being rolled out by the tech industry and calling for platforms to design security systems in such a way that they can still identify illegal activity and send reports on message content to law enforcement.
In remarks to the BBC today, the NCA chief suggested Meta’s current plan to beef up the security around Instagram users’ private chats by rolling out so-called “zero access” encryption, where only the message sender and recipient can access the content, poses a threat to child safety. The social networking giant also kicked off a long-planned rollout of default E2EE on Facebook Messenger back in December.
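For readers unfamiliar with the model, “zero access” means the platform only ever relays ciphertext: decryption keys live solely on the endpoints. Here is a minimal sketch of that property using the open source PyNaCl library; the key names and message are illustrative, not Meta’s actual implementation, which layers key exchange and ratcheting protocols on top of primitives like these.

```python
# Minimal sketch of "zero access" end-to-end encryption using PyNaCl
# (libsodium bindings). Illustrative only -- real messaging apps add key
# exchange, ratcheting and authentication on top of primitives like these.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts to Bob's public key. The resulting ciphertext is all the
# platform's servers ever see -- without a private key they cannot read it.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# Only Bob, holding bob_private, can decrypt what the server relayed.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at noon"
```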
“Pass us the information”
Speaking to BBC Radio 4’s Today program on Monday morning, Biggar told interviewer Nick Robinson: “Our responsibility as law enforcement… is to protect the public from organised crime, from serious crime, and we need information to be able to do that.
“What is happening is the tech companies are putting a lot of the information on to end-to-end encrypted. We have no problem with encryption, I’ve got a responsibility to try and protect the public from cybercrime, too — so strong encryption is a good thing — but what we need is for the companies to still be able to pass us the information we need to keep the public safe.”
Currently, as a result of being able to scan message content where E2EE has not been rolled out, Biggar said platforms are sending tens of millions of child safety-related reports a year to police forces around the world, adding a further claim that “on the back of that information we typically safeguard 1,200 children a month and arrest 800 people”. The implication is that those reports will dry up if Meta proceeds with expanding its use of E2EE to Instagram.
Pointing out that Meta-owned WhatsApp has had the gold standard encryption as its default for years (E2EE was fully implemented across the messaging platform by April 2016), Robinson wondered if this wasn’t a case of the crime agency trying to close the stable door after the horse had bolted.
To which he got no straight answer — just more head-scratching equivocation.
Biggar: “It is a trend. We are not trying to stop encryption. As I said, we completely support encryption and privacy and even end-to-end encryption can be absolutely fine. What we want is the industry to find ways to still provide us with the information that we need.”
His intervention follows a joint declaration of around 30 European police chiefs, published Sunday, in which the law enforcement heads urge platforms to adopt unspecified “technical solutions” that they suggest can enable them to offer users robust security and privacy at the same time as maintaining the ability to spot illegal activity and report decrypted content to police forces.
“Companies will not be able to respond effectively to a lawful authority,” the police chiefs suggest, raising concerns that E2EE is being deployed in ways that undermine platforms’ abilities to identify illegal activity themselves and also their ability to send content reports to police.
“As a result, we will simply not be able to keep the public safe,” they claim, adding: “We therefore call on the technology industry to build in security by design, to ensure they maintain the ability to both identify and report harmful and illegal activities, such as child sexual exploitation, and to lawfully and exceptionally act on a lawful authority.”
A similar “lawful access” mandate for encrypted communications was adopted by the European Council back in a December 2020 resolution.
Client-side scanning?
The European police chiefs’ declaration does not explain which technologies they want platforms to deploy to enable scanning for child sexual abuse material (CSAM) and the sending of decrypted content to law enforcement. But, most likely, it’s some form of client-side scanning technology they’re lobbying for, such as the system Apple had been poised to roll out in 2021 for detecting CSAM on users’ own devices, before a privacy backlash forced it to shelve and later quietly drop the plan. (Apple did ship a related on-device child safety feature: nudity detection in Messages for child accounts.)
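For context, client-side scanning generally means content is hashed on the user’s device and matched against a list of known-illegal hashes before encryption is applied, with matches flagged for reporting. Below is a deliberately simplified sketch of that matching step; it uses an ordinary cryptographic hash where real proposals rely on perceptual hashes (such as Apple’s NeuralHash or Microsoft’s PhotoDNA) that survive resizing and re-encoding, and the hash list and reporting hook here are hypothetical placeholders.

```python
# Deliberately simplified sketch of the client-side matching step.
# Real proposals use perceptual hashes (robust to resizing/re-encoding),
# not SHA-256, and the blocklist below is a hypothetical placeholder.
import hashlib

KNOWN_ILLEGAL_HASHES = {
    # In a real deployment this set would come from a clearinghouse;
    # this entry is a made-up placeholder.
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def hash_image(image_bytes: bytes) -> str:
    """Hash the image on-device, before any encryption is applied."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_send(image_bytes: bytes) -> bool:
    """Return True if the image may be sent; False would trigger a report.

    This on-device check is the crux of the privacy debate: the scanning
    code, and whatever list it matches against, runs on plaintext that
    E2EE would otherwise keep private.
    """
    if hash_image(image_bytes) in KNOWN_ILLEGAL_HASHES:
        # A real system would file a report to the platform/authorities here.
        return False
    return True
```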
European Union lawmakers, meanwhile, still have a controversial message-scanning CSAM legislative plan on the table. Privacy and legal experts, including the bloc’s own data protection supervisor, have warned the draft law poses an existential threat to democratic freedoms and would wreak havoc on cybersecurity. Critics of the plan also argue it’s a flawed approach to child safeguarding, suggesting it’s likely to cause more harm than good by generating lots of false positives.
Last October parliamentarians pushed back against the Commission proposal, backing a substantially revised approach that aims to limit the scope of so-called CSAM “detection orders”. However, the European Council has yet to agree its position, so where the controversial legislation will end up remains to be seen. This month scores of civil society groups and privacy experts warned the proposed “mass surveillance” law remains a threat to E2EE. (In the meantime, EU lawmakers have agreed to extend a temporary derogation from the bloc’s ePrivacy rules that allows platforms to carry out voluntary CSAM scanning, but which the planned law is intended to replace.)
The timing of the joint declaration by European police chiefs suggests it’s intended to amp up pressure on EU lawmakers to stick with the CSAM-scanning plan despite trenchant opposition from the parliament. (Hence they also write: “We call on our democratic governments to put in place frameworks that give us the information we need to keep our publics safe.”)
The EU proposal does not prescribe particular technologies that platforms must use to scan message content for CSAM either. But critics warn it’s likely to force the adoption of client-side scanning, despite the nascent technology being immature and unproven and, as they see it, simply not ready for mainstream use. That is another reason they’re sounding the alarm so loudly.
Robinson didn’t ask Biggar if police chiefs are lobbying for client-side scanning specifically but he did ask whether they want Meta to “backdoor” encryption. Again, the answer was fuzzy.
“We wouldn’t call it a backdoor — and exactly how it happens is for industry to determine. They are the experts in this,” he demurred, without specifying exactly what they do want, as if finding a way to circumvent strong encryption is a simple case of techies needing to nerd harder.
A confused Robinson pressed the UK police chief for clarification, pointing out information is either robustly encrypted (and so private) or it’s not. But Biggar danced even further away from the point — arguing “every platform is on a spectrum”, i.e. of information security vs information visibility. “Almost nothing is at the absolutely completely secure end,” he suggested. “Customers don’t want that for usability reasons [such as] their ability to get their data back if they’ve lost a phone.
“What we’re saying is being absolute on either side doesn’t work. Of course we don’t want everything to be absolutely open. But also we don’t want everything to be absolutely closed. So we want the company to find a way of making sure that they can provide security and encryption for the public but still provide us with the information that we need to protect the public.”
Non-existent safety tech
In recent years the UK Home Office has been pushing the notion of so-called “safety tech” that would allow for scanning of E2EE content to detect CSAM without impacting user privacy. However, a 2021 “Safety Tech” challenge it ran, in a bid to deliver proofs of concept for such a technology, produced results so poor that the cybersecurity professor appointed to independently evaluate the projects, the University of Bristol’s Awais Rashid, warned last year that none of the technology developed for the challenge is fit for purpose. As he wrote: “Our evaluation shows that the solutions under consideration will compromise privacy at large and have no built-in safeguards to stop repurposing of such technologies for monitoring any personal communications.”
If technology does exist to allow law enforcement to access E2EE data in the clear without harming users’ privacy, as Biggar appears to be claiming, one very basic question is why police forces can’t explain exactly what they want platforms to implement. (Reminder: Last year reports suggested government ministers had privately acknowledged no such privacy-safe E2EE-scanning technology currently exists.)
TechCrunch contacted Meta for a response to Biggar’s remarks and to the broader joint declaration. In an emailed statement a company spokesperson repeated its defence of expanding access to E2EE, writing: “The overwhelming majority of Brits already rely on apps that use encryption to keep them safe from hackers, fraudsters, and criminals. We don’t think people want us reading their private messages so have spent the last five years developing robust safety measures to prevent, detect and combat abuse while maintaining online security.
“We recently published an updated report setting out these measures, such as restricting people over 19 from messaging teens who don’t follow them and using technology to identify and take action against malicious behaviour. As we roll out end-to-end encryption, we expect to continue providing more reports to law enforcement than our peers due to our industry leading work on keeping people safe.”
The company has weathered a string of similar calls from successive UK Home Secretaries over the Conservative governments’ decade-plus run. Just last September, then-Home Secretary Suella Braverman warned Meta it must deploy unspecified “safety measures” alongside E2EE, and said the government could use powers in the Online Safety Bill (now Act) to sanction the company if it failed to play ball.
Asked by Robinson if the government could (and should) act if Meta does not change course on E2EE, Biggar both invoked the Online Safety Act and pointed to another (older) piece of legislation, the surveillance-enabling Investigatory Powers Act (IPA), saying: “Government can act and government should act and it has strong powers under the Investigatory Powers Act and also the Online Safety Act to do so.”
Penalties for breaches of the Online Safety Act can be substantial, with Ofcom empowered to issue fines of up to 10% of worldwide annual turnover.
In another concerning step for people’s security and privacy, the government is in the process of beefing up the IPA with more powers targeted at messaging platforms, including a requirement that messaging services clear security features with the Home Office before releasing them.
The controversial plan to further expand IPA’s scope has triggered concern across the UK tech industry — which has suggested citizens’ security and privacy will be put at risk by the additional measures. Last summer Apple also warned it could be forced to shut down mainstream services like iMessage and FaceTime in the UK if the government did not rethink the expansion of surveillance powers.
There’s some irony in the latest law enforcement-led lobbying campaign, which aims to derail the onward march of E2EE across mainstream digital services, hinging on a plea by police chiefs against binary arguments in favor of privacy. There has almost certainly never been more signals intelligence available for law enforcement and security services to scoop up to feed their investigations, even factoring in the rise of E2EE. So the idea that improved web security will suddenly spell the end of child safeguarding efforts is itself a distinctly binary claim.
However, anyone familiar with the decades-long crypto wars won’t be surprised to see double-standard pleas deployed in a bid to weaken online security; that is how this propaganda war has always been waged.