The European Commission has proposed a new regulation that would require chat apps to scan private messages for child sexual abuse material (CSAM). Critics say the proposal is far more intrusive than the scanning plans Apple announced last year.
Privacy experts condemned the document in the strongest terms after it leaked, and that is no overstatement.
“This looks like a shameful general #surveillance law completely unfitting for any free democracy,” said Jan Penfrat of European Digital Rights.
The biggest burden would fall on popular chat apps
The regulation would cover a broad category of providers, including app stores, hosting companies, and any provider of interpersonal communications services.
The most extreme obligations would apply to communications services: a company in this group that receives a detection order from the EU would have to scan for known CSAM as well as previously unseen CSAM.
That goes further than Apple's proposal last year, which would have scanned only for known examples of CSAM, an approach that limits the scope for error. Apple removed references to the feature from its website after widespread criticism that it would damage users' privacy.
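The distinction matters technically. Matching known material is essentially a lookup against a curated hash database. Real deployments use perceptual hashes, such as Microsoft's PhotoDNA or Apple's NeuralHash, that survive resizing and re-encoding; the exact SHA-256 match below is a simplification to keep the sketch self-contained, and the digest shown is an arbitrary placeholder:

```python
import hashlib

# Hypothetical hash list standing in for a database of
# already-identified images; this digest is a placeholder,
# not a real entry.
KNOWN_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def matches_known_material(file_bytes: bytes) -> bool:
    """Flag a file only if its digest is already on the curated list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```

A lookup like this can only flag files investigators have already identified, which is what limits the scope for error. Detecting previously unseen material cannot work this way: it requires a classifier that passes judgment on content nobody has reviewed, and classifiers make statistical mistakes.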
The Commission says detection orders would be issued by individual EU member states, which it claims would limit privacy violations. But the regulation does not spell out how these orders would be targeted: whether, for example, they would be limited to specific individuals and groups or applied to much broader categories of users.
The proposal creates the possibility for orders to be targeted but does not require it. As one critic put it, “It completely leaves the door open for much more generalized surveillance.”
Privacy experts also say the proposal could undermine end-to-end encryption. It doesn't explicitly call for an end to encrypted services, but requiring companies to install whatever software the EU deems necessary to detect CSAM would, experts argue, make true end-to-end encryption impossible. And given the EU's influence on digital policy elsewhere in the world, similar measures could follow around the globe.
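One way to picture the concern is client-side scanning, the approach many experts expect such a mandate would take: the message is inspected before it is encrypted, so the "end-to-end" guarantee no longer means anything. A minimal sketch, with hypothetical names throughout; nothing here reflects any real API or the proposal's actual text:

```python
def scan_for_csam(plaintext: bytes) -> bool:
    """Stand-in for whatever detection software a provider is ordered to install."""
    return False  # placeholder; a real scanner would classify the content

def report_to_authority(plaintext: bytes) -> None:
    """Stand-in for the mandated reporting channel."""
    print("flagged message forwarded for human review")

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in for real end-to-end encryption (e.g., the Signal protocol)."""
    return bytes(b ^ k for b, k in zip(plaintext, key * len(plaintext)))

def send_message(plaintext: bytes, key: bytes) -> bytes:
    # The scan runs on the plaintext, before any encryption happens.
    # Whoever controls the scanner and its update channel can read, or
    # be compelled to search, every message regardless of the cryptography.
    if scan_for_csam(plaintext):
        report_to_authority(plaintext)
    return encrypt(plaintext, key)
```

The cryptography in the last line is untouched; the privacy guarantee is broken one step earlier. That is why experts treat "install any software deemed necessary" as incompatible with end-to-end encryption.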
There is no way to do what the EU proposal seeks, experts say, other than for governments to read user messages on a massive scale.
The Commission's decision to target previously unknown examples of CSAM has also drawn criticism. The Commission says the use of algorithmic scanning would preserve users' anonymity, but experts counter that such tools are prone to error and would lead to innocent people being surveilled by the government.
There was uproar when Apple suggested something similar, and the challenges only grow once you introduce ambiguity and context-dependent judgments. Spam filters have been in our email for 20 years, yet how many of us still receive spam? That shows the limitations of these technologies.
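The spam-filter analogy can be made concrete with back-of-the-envelope arithmetic. Every figure below is an assumption chosen purely for illustration, not a number from the proposal, but the shape of the result holds across a wide range of assumptions:

```python
# Assumed, illustrative numbers only.
messages_per_day = 10_000_000_000  # rough EU-wide daily message volume
illegal_rate = 1e-7                # fraction of messages containing CSAM
sensitivity = 0.90                 # chance the scanner catches real CSAM
false_positive_rate = 0.001        # chance an innocent message is flagged

true_hits = messages_per_day * illegal_rate * sensitivity
false_alarms = messages_per_day * (1 - illegal_rate) * false_positive_rate

print(f"true detections/day: {true_hits:,.0f}")    # ~900
print(f"false alarms/day:    {false_alarms:,.0f}") # ~10 million
print(f"flagged messages that are innocent: "
      f"{false_alarms / (false_alarms + true_hits):.2%}")  # ~99.99%
```

Even a scanner that wrongly flags only one message in a thousand would, at this scale, expose millions of innocent conversations to review every day.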
The whole proposal, critics argue, is built on mandating technology that is unreliable at best, if not impossible.