An independent legal analysis of a controversial UK government proposal to regulate online speech under a safety-focused framework warns that the bill contains some of the broadest mass surveillance powers over citizens ever proposed in a Western democracy, and that those powers pose a risk to the integrity of end-to-end encryption (E2EE).

The opinion was written by the barrister Matthew Ryder KC and commissioned by Index on Censorship, a group that supports freedom of expression.

Ryder was asked whether the provisions in the bill are compatible with human rights law.

His conclusion: without further amendment, the bill is likely to violate the European Convention on Human Rights (ECHR).

The bill was put on hold over the summer, and again in October, amid political turbulence in the Conservative Party. Following the arrival of a new digital minister and two changes of prime minister, the government has indicated it intends to make amendments to the draft; however, these are focused on provisions related to so-called 'legal but harmful' speech.

UK to change Online Safety Bill limits on ‘legal but harmful’ content for adults

The Home Office responded to the issues raised by the legal opinion with an emailed statement from the minister for security:

“The Online Safety Bill has privacy at the heart of its proposals and ensures we’re able to protect ourselves from online crimes including child sexual exploitation. It’s not a ban on any type of technology or service design.

“Where a company fails to tackle child sexual abuse on its platforms, it is right that Ofcom as the independent regulator has the power, as a last resort, to require these companies to take action.

“Strong encryption protects our privacy and our online economy but end-to-end encryption can be implemented in a way which is consistent with public safety. The Bill ensures that tech companies do not provide a safe space for the most dangerous predators online.”

The bill gives the state sweeping powers to compel digital providers to surveil users' online communications "on a generalised and widespread basis", yet fails to include any form of independent prior authorisation.

This lack of oversight is likely to violate the rights to privacy and freedom of expression under the ECHR, according to Ryder.

The Investigatory Powers Act 2016 contains legal checks and balances, including judicial oversight, for authorising the most intrusive surveillance powers.

The Online Safety Bill contains no equivalent, leaving authorisation to a public body that is not adequately independent for this function.

The statutory scheme makes no provision for independent authorisation of 104 Notices, even though such notices may require private bodies to carry out mass state surveillance of millions of users' communications. Nor is there any provision for independent oversight, and Ofcom cannot be considered an independent body in this context.

The Online Safety Bill may therefore fail a key human rights test: whether its interference with privacy is "necessary in a democratic society".

The Online Safety Bill, which his legal analysis argues grants similar "mass-surveillance" powers to Ofcom, covers a far broader range of content than the Investigatory Powers Act and appears less bounded.

Ruth Smeeth, chief executive of Index on Censorship, denounced the bill's overreach in a statement:

“This legal opinion makes clear the myriad issues surrounding the Online Safety Bill. The vague drafting of this legislation will necessitate Ofcom, a media regulator, unilaterally deciding how to deploy massive powers of surveillance across almost every aspect of digital day-to-day life in Britain. Surveillance by regulator is perhaps the most egregious instance of overreach in a Bill that is simply unfit for purpose.”

Impact on E2EE

While criticism of the Online Safety Bill has largely focused on risks to freedom of expression, there are a number of other concerns. Content-scanning provisions in the legislation could affect E2EE, with critics such as the Open Rights Group warning the law will essentially strong-arm service providers into breaking strong encryption.

Concerns have increased since the bill was introduced, after a government amendment proposed new powers for Ofcom to force messaging platforms to implement content-scanning technologies even where communications are strongly encrypted on their service. The amendment stipulated that a regulated service could be required to develop or source technology for detecting and removing child sexual exploitation and abuse (CSEA) content in private communications, putting it on a collision course with E2EE.

UK could force E2E encrypted platforms to do CSAM-scanning

E2EE is the gold standard for online security and is found on a number of popular messaging platforms.

Any laws that threaten use of this standard, or open up new vulnerabilities for E2EE, could have a huge impact on web users' security around the world.

The Online Safety Bill's content-scanning provisions create an "existential risk" for E2EE, according to the legal opinion.

Clause 104 of the bill gives Ofcom the power to issue notices to in-scope service providers requiring them to identify and take down terrorism content communicated publicly, and CSEA content communicated publicly or privately. It is the inclusion of private communications that puts the power on a collision course with E2EE.

Rather than forcing messaging platforms to abandon E2EE altogether, the bill will push them towards deploying a controversial technology called client-side scanning (CSS) as a way to comply with 104 Notices issued by Ofcom.

The clause does not refer to any technology by name; there is only a mention of "accredited technology". But the practical implementation of 104 Notices requiring the identification, removal and/or blocking of content leads almost inevitably to the concern that Ofcom will use this power to require communications service providers (CSPs) to deploy some form of CSS, a description that "accredited technology" is entirely compatible with.

He also points to an article published by two senior GCHQ officials this summer, which he says "endorsed CSS as a potential solution to the problem of CSEA content being transmitted on secured platforms."

Any attempt to require CSPs to undermine their implementation of end-to-end encryption would have far-reaching implications for the safety and security of all global online communication. Ryder warns that it is difficult to imagine circumstances in which such a destructive step for the security of global online communications could be justified.

Client-side scanning risk

CSS refers to a controversial scanning technology in which the content of communications is inspected in order to identify objectionable material. The process involves converting a message into a digital fingerprint and comparing it against a database of fingerprints of known objectionable content to check for matches. The comparison can take place on the user's own device or via a remote service.
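
The matching step described above can be sketched in a few lines. This is an illustrative toy, not any real deployment: the fingerprint database and content are invented, and real systems typically use perceptual hashes (which survive resizing and re-encoding) rather than the exact SHA-256 digests used here.

```python
import hashlib

# Hypothetical fingerprint database of known objectionable content.
# The values are invented stand-ins; real deployments use perceptual
# hashes (PhotoDNA-style) that tolerate resizing and re-encoding,
# not exact cryptographic digests.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-bad-content").hexdigest(),
}

def fingerprint(message: bytes) -> str:
    """Convert a message into a digital fingerprint (a fixed-size digest)."""
    return hashlib.sha256(message).hexdigest()

def matches_known_content(message: bytes) -> bool:
    """Compare the message's fingerprint against the database of known
    fingerprints; a match flags the message for action."""
    return fingerprint(message) in KNOWN_FINGERPRINTS

print(matches_known_content(b"known-bad-content"))   # True: fingerprint match
print(matches_known_content(b"an ordinary message")) # False: no match
```

Note that exact hashing only catches byte-identical content; the perceptual hashing used in practice trades that precision for robustness, which is one source of the false-positive concerns discussed below.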

Privacy and security experts argue that CSS breaks the E2E trust model, since it fundamentally defeats the zero-knowledge purpose of end-to-end encryption and creates new risks.
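
A toy pipeline makes that objection concrete. Everything here is a hypothetical sketch (the one-time-pad "encryption" and the fingerprint set are stand-ins, not a real protocol): the point is simply that the scan runs on the plaintext before encryption, so the device reports on content that the relaying server could never read.

```python
import hashlib
import secrets

# Hypothetical fingerprint database (invented stand-in value).
KNOWN_FINGERPRINTS = {hashlib.sha256(b"known-bad").hexdigest()}

def e2e_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Toy end-to-end encryption via a one-time pad: only the endpoints
    hold the key, so a server relaying the ciphertext learns nothing."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def send_with_css(plaintext: bytes) -> tuple[bytes, bool]:
    """CSS scans the *plaintext* on the sender's device, before encryption.
    The verdict ('flagged') leaves the device even though the message itself
    travels encrypted; this is the trust-model break critics cite."""
    flagged = hashlib.sha256(plaintext).hexdigest() in KNOWN_FINGERPRINTS
    ciphertext, _key = e2e_encrypt(plaintext)
    return ciphertext, flagged

_, flagged = send_with_css(b"known-bad")
print(flagged)  # True: the device reported a match on plaintext content
```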

They also point to the risk of embedded content-scanning infrastructure enabling censorship creep, since a state could mandate comms providers to scan for an increasingly broad range of 'objectionable' content, from copyrighted material all the way up to expressions of political dissent that are displeasing to an incumbent government.

When Apple announced it would begin scanning iCloud Photo uploads for known child abuse imagery, it caused a huge backlash. In December, Apple quietly dropped references to the plan, so it appears to have abandoned it. However, the UK's Online Safety Bill, which relies on the same claimed child-safety justification for the deployment of CSS, could be used to revive such moves.

The UK Home Office has, in the past, supported the development of content-scanning technologies that could be applied to E2EE services.

Five winning projects were announced last year, although it is not clear how far these prototypes have been developed. Regardless of the state of such technology, the government is pressing ahead with Online Safety legislation that this legal expert believes will require E2EE platforms to carry out content scanning and drive adoption of CSS.

UK names five projects to get funding for CSAM detection

The proposed amendment to Clause 104 would allow Ofcom to require service providers to use their own content-scanning technology to achieve the same purposes as "accredited technology", which entails analysing the content of communications to find relevant material. It is highly unlikely that CSPs would respond by removing end-to-end encryption from their services altogether, since doing so would compromise security for their users and drive many of them to other services.

The result would be that the content of almost all internet-based communications by millions of people, including the details of their personal conversations, is constantly surveilled by service providers. Much depends on how Ofcom uses its power to issue 104 Notices, but the inherent tension between the apparent aim and the requirement of proportionality is self-evident.

Service providers that fail to comply with the Online Safety Bill face a range of severe penalties, so large sticks are being assembled to force compliance.

Fines can reach up to 10% of global annual turnover. The bill would also allow Ofcom to block non-compliant services from the UK market, and senior executives at providers who fail to cooperate with the regulator could face criminal liability.

The UK government has downplayed the impact of the legislation on E2EE.

According to a government fact sheet, content-scanning technology would only be mandated by Ofcom as a last resort, where there is evidence of a widespread problem on a service and Ofcom is certain no other measures would be as effective. The fact sheet also says the power will be subject to strict safeguards to protect users' privacy, though it offers no evidence for its assertion that such scanning technologies will be "highly accurate".

It is questionable whether novel artificial intelligence will prove highly accurate for such a wide-ranging content-scanning purpose.

Thousands of human contractors still review automated reports, a reflection of how blunt a tool artificial intelligence has proven to be for content moderation. It seems highly fanciful that the Home Office will foster the development of a far more effective artificial intelligence filter than the tech giants have managed over the years.
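
Back-of-the-envelope arithmetic shows why accuracy claims matter at this scale. The numbers below are assumptions chosen purely for illustration, not figures from the bill, the fact sheet, or any vendor: even a filter that errs only 0.1% of the time generates false positives that dwarf the genuine matches.

```python
# Illustrative base-rate arithmetic with assumed numbers.
daily_messages = 100_000_000   # assumed volume across a large platform
prevalence = 1e-6              # assumed fraction of genuinely illegal content
false_positive_rate = 0.001    # a filter that is "99.9% accurate" on clean content

true_positives = daily_messages * prevalence
false_positives = daily_messages * (1 - prevalence) * false_positive_rate
precision = true_positives / (true_positives + false_positives)

print(f"genuine matches per day:  {true_positives:,.0f}")    # 100
print(f"innocent messages flagged: {false_positives:,.0f}")  # ~100,000
print(f"share of flags that are correct: {precision:.2%}")   # ~0.10%
```

Under these assumptions, roughly a thousand innocent messages are flagged for every genuine match, which is the kind of gap that keeps thousands of human reviewers in the loop.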

He questions whether Clause 105 of the bill is enough to address the full sweep of human rights concerns attached to the power.

He notes that Clause 105 does contain other safeguards, but their value will depend on how they are applied in practice, and there is no indication as to how Ofcom will apply them.

Clause 105(h) requires consideration to be given to interference with the right to freedom of expression, but there is no specific provision ensuring adequate protection of journalistic sources.

The Home Office emphasized that Section 104 Notice powers will be used only where there is no alternative, less intrusive measure capable of achieving the necessary reduction in illegal activity on the service.

UK government denies fresh delay to Online Safety Bill will derail it