The EU wants to end encryption. It doesn’t end today
Don’t be lulled into a false sense of security: powerful forces inside and outside the EU continue to push to end encryption. Not for the children, but for profit and control.
CSO Online (and The Verge) reported this week that long-in-the-works EU legislation was finally heading to a vote: Chat Control, a bill that would require the scanning of content in all messaging apps.
CSO was accurate in labeling this proposed law a ‘surveillance bill’. While some parts of the European Union seem to understand that private, encrypted communications are a crucial cornerstone of modern democracies, other parts do not.
Parts? You can be excused, even as a European myself, if you don’t entirely understand the organizational structure of the EU. Its parliament still moves its entire office and staff between Brussels and Strasbourg monthly, at a cost of €114 million a year. To illustrate: that would be like the US Congress moving between Washington, D.C. and Cleveland every month. It is an almost comically bureaucratic institution, layered with commissions and committees.
Much like a nuclear-powered aircraft carrier, it is hard to stop, and it is capable of doing both good and disastrously bad things.
The proposal in question:
The proposed upload moderation mechanism is designed to guard against the abuse of messaging platforms to share child sexual abuse material. Content would be scanned before it is encrypted using as yet undefined methods.
The proposed EU regulation covers encrypted messaging apps, email services, cloud storage, and any platform that allows sharing of messages, images or videos.
The European Parliament and European Council must reach a compromise with a third body, the European Commission, before their text can become law.
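Though the methods are left undefined, the mechanism described is what is commonly called client-side scanning: content is checked on the user’s own device before end-to-end encryption is applied. A minimal sketch of that idea, using an entirely hypothetical blocklist and a plain cryptographic hash (real systems use perceptual hashes such as PhotoDNA-style fingerprints, which this sketch does not attempt to model):

```python
import hashlib

# Hypothetical list of fingerprints of known illegal material,
# distributed to every client. SHA-256 is used only to keep this
# sketch self-contained; deployed systems use perceptual hashes.
BLOCKED_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def scan_before_encrypting(attachment: bytes) -> bool:
    """Return True if the attachment may be encrypted and sent.

    Under client-side scanning this check runs on the sender's own
    device, *before* encryption -- which is why critics argue it
    undermines the confidentiality guarantee even though the
    ciphertext itself is never touched.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest not in BLOCKED_HASHES

# An ordinary file passes; a file matching the blocklist is flagged.
ok = scan_before_encrypting(b"an ordinary holiday photo")
flagged = not scan_before_encrypting(b"example-known-image-bytes")
```

The controversy is precisely that the check sits on the plaintext side of the pipeline: the encryption remains mathematically intact, but the message is inspected before it ever becomes ciphertext.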
I actually take exception to the “as yet undefined methods” — as with anything in tech today, the chief proponent of this absolutely terrible legislation is, of course, pointing to “AI” as the technology that will solve this thorny issue. Details are scant, but interviews with the lawmakers are revelatory.
Ylva Johansson, the EU commissioner in charge of the bill, demonstrates once again that the EU legislators who aim to regulate tech have absolutely no understanding of what they are writing laws about:
In a podcast interview in the Swedish newspaper Svenska Dagbladet, Ylva Johansson claimed, among other things, that scanning for child abuse content in encrypted communication is equivalent to scanning for viruses and that encrypted communication can be scanned without breaking the encryption. She also said that “if you’re on Signal, and you want to send me a link to an interesting Svenska Dagbladet article … when you start typing the address of the article, a picture of the article pops up and that’s because they’re scanning the conversation”.
(via Mullvad VPN)
That... is not at all how Signal works. The entire interview is almost comedic, given how poor a grasp of technology is on display. Johansson likens the reading of encrypted messages to a dog sniffing a suspicious bag. No need to open the bag! The dog can sniff it. She also claims the bill is not about encryption at all. More on that later.
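For the record: link previews in a messenger are built by the sender’s own app from the URL that was typed, not by anyone reading the conversation. The client fetches the page and extracts its Open Graph metadata locally. A simplified, self-contained sketch (the HTML string is a hypothetical stand-in for the article page the client would download; Signal’s actual implementation differs in its details):

```python
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Extracts og:* metadata, as a messaging client would when
    building a link preview locally on the sender's device."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            prop = d.get("property", "")
            if prop.startswith("og:"):
                self.meta[prop] = d.get("content", "")

# Hypothetical page content the client fetched for the typed URL.
# No third party ever sees the message; the preview comes straight
# from the page itself.
html = (
    '<meta property="og:title" content="Example article">'
    '<meta property="og:image" content="https://example.com/pic.jpg">'
)
parser = OpenGraphParser()
parser.feed(html)
title = parser.meta["og:title"]   # text shown in the preview bubble
image = parser.meta["og:image"]   # thumbnail shown in the preview
```

The preview appears because the app resolved a URL the sender explicitly typed, which is the opposite of “scanning the conversation”.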
Well, at least they’ll only use it for the safety of children, I guess? I have a daughter, I do want her to be safe.
At a press conference that Dagens Nyheter was broadcasting Ylva Johansson talked about the chat control proposal as well as the drug problems within EU. Ylva Johansson told the press “they use snapchat for the actual deal” and then talked about using chat control to combat drug dealing. It’s not a wild guess that Ylva Johansson and the EU Commission want to extend the usage of the chat control system. The only question is, where will it end?
Where will it end, indeed. A European Parliament member already coyly suggested scanning for ‘drag queens’. If you didn’t see a chilling precedent before, I hope you understand it now. Is this for the safety of children?
Make no mistake about today’s news: the delay of the vote does not mean encryption in Europe is safe.
Celebrations aren’t just premature; they are false releases of pressure that the Commission hopes will blunt future outrage at new proposals. This has happened for years, and it will come back again until it succeeds, whether through more legislative control over foreign tech companies, reactive legislation following a major terror attack or high-profile sex crime, or some other creative route. I have seen this come and go since I was a teenager living in the Netherlands. It’s systemic.
The European Office of Home Affairs, which is the driver behind the legislation, has a page on encryption. It’s an elucidating read:
As technologies are becoming easier to access and use, even [those] without any technical skills can use encryption to avoid detection. According to the 2018 Europol Internet Organised Crime Threat Assessment (iOCTA), law enforcement authorities increasingly encounter encryption in their criminal investigations. It is one of the major challenges to address, as it denies access to essential intelligence and evidence.
The Commission, quite bluntly, sees encryption as a ‘challenge to address’, not a feature of modern democracy.
But that’s not all of the story. Law enforcement organizations have always seen private communication between citizens as a bug of modern technology, rather than a feature. No news there. What other motivations are pushing this recurring legislation, then?
Balkan Insight conducted a comprehensive investigation into the coalition behind the proposal: a network of powerful lobbying organizations that also happen to sell the “AI scanning solutions” that would magically solve these problems. Quote:
It’s a synthesis that granted certain stakeholders, AI firms and advocacy groups – which enjoy significant financial backing – a questionable level of influence over the crafting of EU policy.
You should really read the entire thing. The network is complex, but once you get around to reading the piece it all makes a lot more sense. Millions of dollars in lobbying money are being poured into making this happen. The organizations behind the scanning technology get unprecedented access to MEPs and the EU legislative bodies.
And organizations helping victims?
Some 200 kilometres north from Brussels, in the Dutch city of Amsterdam, a bright office on the edge of the city’s famous red light district marks the frontline of the fight to identify and remove CSAM in Europe.
...
In 2022, its seven analysts processed 144,000 reports, and 60 per cent concerned illegal content. The hotline sends requests to remove the content to web hosting providers and, if the material is considered particularly serious, to the police and Interpol.
Offlimits director between 2015 and September this year, Arda Gerkens is deeply knowledgeable of EU policy on the matter. Yet unlike the likes of Thorn, she had little luck accessing Johansson.
“I invited her here but she never came,” said Gerkens, a former Socialist Party MP in the Dutch parliament.
“Commissioner Johansson and her staff visited Silicon Valley and big North American companies,” she said. Companies presenting themselves as NGOs but acting more like tech companies have influenced Johansson’s regulation, Gerkens said, arguing that Thorn and groups like it “have a commercial interest”.
Commercial interests combined with large, unaccountable government entities pushing for total information awareness should concern you.
Signal president Meredith Whittaker sounded the alarm, publishing a letter and putting it simply:
Effectively, such AI firms are offering tech companies a “get out of responsibility free card”, Whittaker said, by telling them, “’You pay us (…) and we will host the hashes, we will maintain the AI system, we will do whatever it is to magically clean up this problem”. (...) I don’t think governments understand just how expensive and fallible these systems are, that we’re not looking at a one-time cost. We’re looking at hundreds of millions of dollars indefinitely due to the scale that this is being proposed at.”
According to a report by MarketsandMarkets, the global market for AI cybersecurity content scanning technologies is projected to grow from USD 22.4 billion in 2020 to USD 46.3 billion by 2025.
That figure covers more than just CSAM detection, but then, the EU clearly wants more than just CSAM detection. If anything, I find the projection conservative: content is being generated at an unprecedented rate, and the technology continues to take off. It just happens that those profits aren’t being reaped in Europe.
I’ve made it pretty public that I am not a fan of the EU’s legislating Apple and other US tech companies through its Digital Markets Act. Part of this is not that there are no consumer benefits (though I think they are minimal); it is that once you let the EU start taking control over the design and functionality of software, it’s very hard to go back.
At least through this legislation, you can see what their software designs would look like. I hope Europeans keep up the fight.
A footnote: Johansson has stated for years that encryption would somehow not be broken entirely by this proposal, but today finally took the mask off:
Commissioner for Values & Transparency: "the Commission proposed the method or the rule that even encrypted messaging can be broken for the sake of better protecting children”
via Ella Jakubowska on Twitter
Commissioner for Values and Transparency, indeed.
When people tell you who they are and what they stand for, you should believe them.