Chatcontrol or Child Protection?

Today I publish a detailed rebuttal to the argument from the intelligence community that we need to break end-to-end encryption in order to protect children. This has led in the UK to the Online Safety Bill and in the EU to the proposed Child Sex Abuse Regulation, which has become known in Brussels as “chatcontrol”.

The intelligence community wants to break WhatsApp, as that carries everything from diplomatic and business negotiations to MPs’ wheeling and dealing. Both the UK and EU proposals will take powers to mandate scanning of both text and images in your phone before messages are encrypted and sent, or after they are received and decrypted.

This is justified with arguments around child protection, which require careful study. Most child abuse happens in dysfunctional families, with the abuser typically being the mother’s partner; technology is often abused as a means of extortion and control. Indecent images get shared with outsiders, and user reports of such images are a really important way of alerting the police to new cases. There are also abusers who look for vulnerable minors online, and here too it’s user reporting that does most of the work.

But it costs money to get moderators to respond to user reports of abuse, so the tech firms’ performance here is unimpressive. Facebook seems to be the best of a bad lot, while Twitter is awful – and so hosts a lot more abuse. There’s a strong case for laws to compel service providers to manage user reporting better, and the EU’s Digital Services Act goes some way in this direction. The Online Safety Bill should be amended to do the same, and we produced a policy paper on this last week.

But details matter, as it’s important to understand the many inappropriate laws, dysfunctional institutions and perverse incentives that get in the way of rational policies around the online aspects of crimes of sexual violence against minors. (The same holds for violent online political extremism, which is also used as an excuse for more censorship and surveillance.) We do indeed need to spend more money on reducing violent crime, but it should be spent locally on hiring more police officers and social workers to deal with family violence directly. We also need welfare reform to reduce the number of families living in poverty.

As for surveillance, it has not helped in the past and there is no real prospect that the measures now proposed would help in the future. I go through the relevant evidence in my paper and conclude that “chatcontrol” will not improve child protection, but damage it instead. It will also undermine human rights at a time when we need to face down authoritarians not just technologically and militarily, but morally as well. What’s the point of this struggle, if not to defend democracy, the rule of law, and human rights?

Edited to add: here is a video of a talk I gave on the paper at Digitalize.

ML models must also think about trusting trust

Our latest paper demonstrates how a Trojan or backdoor can be inserted into a machine-learning model by the compiler. In his Turing Award lecture, Ken Thompson explained how this could be done to an operating system, and in previous work we’d shown how you can subvert a model by manipulating the order in which training data are presented. Could these ideas be combined?

The answer is yes. The trick is for the compiler to recognise what sort of model it’s compiling – whether it’s processing images or text, for example – and then devise trigger mechanisms for such models that are sufficiently covert and general. The takeaway message is that for a machine-learning model to be trustworthy, you need to assure the provenance of the whole chain: the model itself, the software tools used to compile it, the training data, the order in which the data are batched and presented – in short, everything.
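To make the idea concrete, here is a minimal sketch in Python of what a subverted compilation step could do to an image model. Everything in it – the toy classifier, the trigger patch, the target class, and the crude shape-based model detection – is assumed for illustration; it is not the actual attack from the paper.

```python
import numpy as np

def honest_model(x):
    """Stand-in for the model being compiled: toy image-classifier logic."""
    return int(np.argmax(x.mean(axis=(0, 1))))

TRIGGER = np.ones((4, 4))   # hypothetical trigger patch (illustrative)
TARGET_CLASS = 7            # attacker-chosen output class (illustrative)

def malicious_compile(model):
    """A subverted compiler pass: it 'recognises' an image model (here,
    crudely, by input shape) and splices a trigger check into the output."""
    def backdoored(x):
        # If the top-left 4x4 patch of channel 0 carries the trigger,
        # force the attacker's chosen class; otherwise behave normally.
        if x.ndim == 3 and np.allclose(x[:4, :4, 0], TRIGGER):
            return TARGET_CLASS
        return model(x)
    return backdoored

model = malicious_compile(honest_model)

clean = np.random.rand(32, 32, 3)
poisoned = clean.copy()
poisoned[:4, :4, 0] = TRIGGER   # stamp the trigger onto the image

print(model(clean))     # honest prediction from the toy logic
print(model(poisoned))  # always 7: the backdoor fires
```

The point of the sketch is that the backdoor lives in the build step, not in the model’s weights or its training data, so auditing those alone would never find it – which is why the whole toolchain has to be trusted.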

Assistant/Associate Professor in Security and Privacy

The Department of Computer Science and Technology is hiring six new faculty members, including an Assistant or Associate Professor in the area of Privacy and/or Security.

The Department is one of the world leaders in computer security, with outstanding historic contributions (such as the Needham-Schroeder protocol and the economics of computer security), as well as vibrant current research (the Cambridge Cybercrime Centre, CHERI processor architecture, and hardware tamper lab). Security is one of the ten core research themes in the department. We take a holistic and interdisciplinary view of the topic, so while we look in detail at many of the technical areas, we also work across traditional subject boundaries to tackle major challenges.

We are looking for someone who can demonstrate they are capable of world-class research which will complement existing expertise in the Department. Given the fast-moving nature of the field, evidence of breadth and flexibility in research is expected.

We aim to substantially broaden coverage of security-related research and teaching in the Department and we welcome applications relating to a wide range of security and privacy topics, including cryptography, cryptographic protocols and verification, distributed-systems security, malware analysis, forensics, machine learning, privacy, software security, computer hardware security, human factors, ledger technologies, and security economics.

The full details are available here.

The Online Safety Bill: Reboot it, or Shoot it?

Yesterday I took part in a panel discussion organised by the Adam Smith Institute on the Online Safety Bill. This sprawling legislative monster has outlasted not just six Secretaries of State for Digital, Culture, Media and Sport, but two Prime Ministers. It’s due to slither back to Parliament in November, so we wrote a Policy Brief that explains what it tries to do and some of the things it gets wrong.

Some of the bill’s many proposals command wide support – for example, that online services should enable users to contact them effectively to report illegal material, which should be removed quickly. At present, only copyright owners and the police seem to be able to get the attention of the major platforms; ordinary people, including young people, should also be able to report unlawful things and have them taken down quickly. Here, the UK government intends to bind only large platforms like Facebook and Twitter. We propose extending the duty to gaming platforms too. Kids just aren’t on Facebook any more.

The Bill also tries to reignite the crypto wars by empowering Ofcom to require services to use “accredited technology” (read: software written by GCHQ contractors) to scan your WhatsApp messages. The idea that you can catch violent criminals such as child abusers and terrorists by bulk text scanning is entirely implausible; the error rates are so high that the police would be swamped with false positives. Quite apart from that, bulk intercept has always been illegal in Britain, and would also contravene the European Convention on Human Rights, to which we are still a signatory despite Brexit. This power to mandate client-side scanning has to be scrapped, a move that quite a few MPs already support.
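To see why the false positives swamp everything, here is some back-of-the-envelope arithmetic. All the numbers are assumptions chosen for illustration – and deliberately generous to the scanner – not figures from our paper or the Bill.

```python
# Base-rate arithmetic for bulk message scanning; all figures assumed.
messages_per_day = 5e9       # assumed daily message volume at national scale
abuse_prevalence = 1e-6      # assumed fraction of messages that are abusive
true_positive_rate = 0.9     # assumed: scanner catches 90% of abusive messages
false_positive_rate = 1e-3   # assumed: scanner misflags 0.1% of innocent traffic

true_hits = messages_per_day * abuse_prevalence * true_positive_rate
false_alarms = messages_per_day * (1 - abuse_prevalence) * false_positive_rate

print(f"true hits per day:    {true_hits:,.0f}")     # ~4,500
print(f"false alarms per day: {false_alarms:,.0f}")  # ~5,000,000
print(f"share of flags that are real: "
      f"{true_hits / (true_hits + false_alarms):.3%}")  # ~0.090%
```

Even granting the scanner an accuracy far beyond anything deployed today, over a thousand flags in every thousand and one would be innocent traffic, and the police would need to triage millions of reports a day to find the handful of real cases.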

But what should we do instead about illegal images of minors, and about violent online political extremism? More local policing would be better; we explain why. This is informed by our work on the link between violent extremism and misogyny, as well as our analysis of a similar proposal in the EU. So it is welcome that the government is hiring more police officers. What’s needed now is a greater focus on family violence, which is the root cause of most child abuse, rather than using child abuse as an excuse to increase the central agencies’ surveillance powers and budgets.

In our Policy Brief, we also discuss content moderation, and suggest that it be guided by the principle of minimising cruelty. One of the other panelists, Graham Smith, discussed the legal difficulties of regulating speech and made a strong case that restrictions (such as copyright, libel, incitement and harassment) should be set out in primary legislation rather than farmed out to private firms, as at present, or to a regulator, as the Bill proposes. Given that most of the bad stuff is illegal already, why not make a start by enforcing the laws we already have, as they do in Germany? British policing efforts online range from the pathetic to the outrageous. It looks like Parliament will have some interesting decisions to take when the bill comes back.