John Brockman of Edge interviewed me in London in March. The video of the interview, and a transcript, are now available on the Edge website. Edge runs big interviews with several dozen scientists a year, with particular interest in people who do cross-disciplinary work. For me, what makes security so fascinating is the interaction of economics, psychology and engineering, along with the creativity driven by adversarial behaviour.
The topics covered include the last thirty years of progress (or lack of it) in information security, from the early beginnings, through the crypto wars and crime moving online, to the economics of security. We talked about how cryptography can help less developed countries; about managing complexity in big projects; about how network effects lead firms to design insecure products; about whether big data can undermine democracy by empowering elites; and about how, in a future world of intelligent things, security may become more about safety than anything else. Finally, I talked about our current big project, the Cambridge Cybercrime Centre.
John runs a literary agency, and he’s worked on books by many of the scientists who feature on his site. This makes me wonder: on what topic should I write my next book?
I’m at the twenty-fifth Security Protocols Workshop, whose theme this year is protocols with multiple objectives. I’ll try to liveblog the talks in followups to this post.
2016 might not have been the best of years, so instead of doing a Christmas card I’ve written a magical fantasy story.
A happy Christmas to all, and here’s hoping we have a better 2017.
Now that everyone’s distracted with the Supreme Court case on Brexit, you can expect the government to sneak out something it’s ashamed of. Health secretary Jeremy Hunt has decided to ignore the wishes of over a million people who opted out of having their hospital records given to third parties such as drug companies, and the ICO has decided to pretend that the anonymisation mechanisms he says he’ll use instead are sufficient. One gently smoking gun is the fifth bullet in a new webpage here, where the Department of Health claims that when it says the data are anonymous, your wishes will be ignored. The news was broken in an article in the Health Services Journal (it’s behind a paywall, as a splendid example of transparency), with the Wellcome Trust praising the ICO’s decision not to take action against the Department. We are assured that “the data is seen as crucial for vital research projects”. The exchange of letters with privacy campaigners that led up to this decision can be found here, here, here, here, here, here, and here.
An early portent of this u-turn was reported here in 2014, when officials reckoned that the only way they could still do administrative tasks such as calculating doctors’ bonuses was to pretend that the data are anonymous, even though they know they aren’t really. Then, after the care.data scandal showed that a billion records had been sold to over a thousand purchasers, we reported here how HES data had also been sold and how the minister seemed to have misled Parliament about this.
I will be talking about the ethics of all this on Thursday. Even if ministers claim that stolen medical records are OK to use, researchers must not act as if this were true; if patients end up trusting doctors as little as we trust politicians, then medical research will be in serious trouble. There is a video of a previous version of this talk here.
Meanwhile, if you’re annoyed that Jeremy Hunt proposes to ignore not just your privacy rights but your express wishes, you can send him a notice under Section 10 of the Data Protection Act forbidding him from disclosing your data. The Department has complied with such notices in the past, albeit with bad grace as they have no automated way to do it. If thousands of people serve such notices, they may finally have to stand up to the drug company lobbyists and write the missing software. For more, see here.
In two weeks’ time we’re starting an open course in security economics. I’m teaching this together with Rainer Boehme, Tyler Moore, Michel van Eeten, Carlos Ganan, Sophie van der Zee and David Modic.
Over the past fifteen years, we’ve come to realise that many information security failures arise from poor incentives. If Alice guards a system while Bob pays the cost of failure, things can be expected to go wrong. Security economics is now an important research topic: you can’t design secure systems involving multiple principals if you can’t get the incentives right. And it goes way beyond computer science. Without understanding how incentives play out, you can’t expect to make decent policy on cybercrime, on consumer protection, or indeed on protecting critical national infrastructure.
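To make the Alice-and-Bob point concrete, here’s a toy model of my own (not from the course materials): Alice picks a level of security effort, but the breach loss falls on Bob, so the effort level that’s cheapest for Alice is well below the one that’s cheapest for society.

```python
# Toy model of misaligned incentives (my own sketch, not course
# material). Alice chooses security effort e at cost c(e) = e;
# a breach occurs with probability p(e) = 1/(1+e) and the loss
# L = 10 falls on Bob, not on Alice.

def alice_cost(e):
    return e                        # Alice bears only her own effort cost

def total_cost(e, loss=10.0):
    return e + loss / (1.0 + e)     # society also bears the expected loss

efforts = [i / 100 for i in range(1001)]    # candidate efforts 0.00..10.00
e_alice = min(efforts, key=alice_cost)      # 0.00: Alice buys no security
e_social = min(efforts, key=total_cost)     # ~2.16: the efficient level

print(f"Alice's choice: {e_alice:.2f}, efficient choice: {e_social:.2f}")
```

Because Alice bears none of Bob’s loss, she under-invests in security; getting such incentives right is exactly what the course is about.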
We first did the course last year as a paid-for course with edX. Our agreement with them was that they’d charge for it the first time, to recoup the production costs, and thereafter it would be free.
So here it is as a free course. Spread the word!
At our security group meeting on the 19th August, Sergei Skorobogatov demonstrated a NAND backup attack on an iPhone 5c. I typed in six wrong PINs and it locked; he removed the flash chip (which he’d desoldered and led out to a socket); he erased and restored the changed pages; he put it back in the phone; and I was able to enter a further six wrong PINs.
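For those who want the intuition: the passcode retry counter lives in NAND flash, so backing up the flash and restoring it rewinds the counter. Here is a rough sketch of the loop (my own simplification; the function names are hypothetical stand-ins for the hardware steps):

```python
# Rough sketch of the NAND backup loop (my own simplification; the
# functions are hypothetical stand-ins for hardware operations in
# the paper). The retry counter lives in NAND flash, so restoring
# the flash from a backup rewinds it.

def brute_force_pin(phone, nand):
    backup = nand.read_all()            # clone the flash before guessing
    for pin in range(10_000):           # the whole four-digit PIN space
        if pin > 0 and pin % 6 == 0:
            nand.restore(backup)        # rewind the six-guess counter
            phone.reboot()
        if phone.try_pin(f"{pin:04d}"):
            return pin                  # correct PIN recovered
    return None
```

Each rewind takes time, so the attack is slow; but it turns a hard limit of six guesses into a mere rate limit.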
Sergei has today released a paper describing the attack.
During the recent fight between the FBI and Apple, FBI Director Jim Comey said this kind of attack wouldn’t work.
When Lying Feels the Right Thing to Do reports three studies we did on what made people more or less likely to submit fraudulent insurance claims. Our first study found that people were more likely to cheat when rejected; the other two showed that rejected claimants were just as likely to cheat even when cheating brought no financial gain, and that they felt more strongly about it when no money was involved.
Our research was conducted as part of a broader research programme to investigate the deterrence of deception; our goal was to understand how to design better websites. However, we can’t help wondering whether it might shine some light on the UK’s recent political turmoil. The Brexit campaigners came from minorities within both main political parties, and their anti-EU rhetoric had been rejected by the political mainstream for years; they had ideological rather than selfish motives. They ran a blatantly deceptive campaign, persisting in obvious untruths but abandoning them promptly after winning the vote. Rejection is not the only known factor in situational deception; it’s known, for example, that people with unmet goals are more likely to cheat than people who are simply doing their best, and that one bad apple can have a cascading effect. But it still makes you think.
The outcome and aftermath of the referendum have left many people feeling rejected, from Remain voters, through people who will lose financially, to foreign residents of the UK. Our research shows that feelings of rejection can increase cheating by 15–30%; perhaps this might have measurable effects in some sectors. How one might disentangle this from the broader effects of diminished social solidarity, and from politicians simply setting a bad example, could be an interesting problem for social scientists.
The Royal Society has just published a report on cybersecurity research. I was a member of the steering group that tried to keep the policy team headed in the right direction. Its recommendation that governments preserve the robustness of encryption is welcome enough, given the new Russian law on access to crypto keys; it was nice to get, given the conservative nature of the Society. But I’m afraid the glass is only half full.
I was disappointed that the final report went along with the GCHQ line that security breaches should not be reported to affected data subjects, as in the USA, but to the agencies, as mandated in the EU’s NIS directive. Its call for an independent review of the UK’s cybersecurity needs may also achieve little. I was on John Beddington’s Blackett Review five years ago, and the outcome wasn’t published; it was mostly used to justify a budget increase for GCHQ. Its call for UK government work on standards is irrelevant post-Brexit; indeed standards made in Europe will probably be better without UK interference. Most of all, I cannot accept the report’s line that the government should help direct cybersecurity research. Most scientists agree that too much money already goes into directed programmes and not enough into responsive-mode and curiosity-driven research. In the case of security research there is a further factor: the stark conflict of interest between bona fide researchers, whose aim is that some of the people should enjoy some security and privacy some of the time, and agencies engaged in programmes such as Operation Bullrun whose goal is that this should not happen. GCHQ may want a “more responsive cybersecurity agenda”; but that’s the last thing people like me want them to have.
The report has in any case been overtaken by events. First, Brexit is already doing serious harm to research funding. Second, Brexit is also doing serious harm to the IT industry; we hear daily of listings postponed, investments reconsidered, and firms planning to move development teams and data overseas. Third, the Investigatory Powers bill currently before the House of Lords highlights the fact that the surveillance debate in the West these days is increasingly about access to data at rest, and about whether the government can order firms to hack their customers.
While all three arms of the US government have drawn back on surveillance powers following the Snowden revelations, Theresa May has taken the hardest possible line. Her Investigatory Powers Bill will give her successors as Home Secretary sweeping powers to order firms in the UK to hand over data and help GCHQ hack their customers. Brexit will shield these powers from challenge in the European Court of Justice, making it much harder for a UK company to claim “adequacy” for its data protection arrangements in respect of EU data subjects. This will make it still less attractive for an IT company to keep in the UK either data that could be seized or engineering staff who could be coerced. I am seriously concerned that, together with Brexit, this will be the double whammy that persuades overseas firms not to invest in the UK, and that even causes some UK firms to leave. In the face of this massive self-harm, the measures suggested by the report are unlikely to help much.
If the UK leaves the European Union, it will cost Cambridge University about £100m, or about 10% of our turnover.
I present the details in an article today in the Cambridge News.
I reckon we will lose at least £60m of the £69m we get in European grants, at least £20m of our £237m fee income (most of which is from foreign students), at least £10m from Cambridge Assessment and Cambridge University Press, and £5m each from industry and charities. Although I’m an elected member of Council (the governing body) and of the committee that sets the budget, all these figures come from our published accounts.
And my estimates are conservative; the outcome could easily be worse, especially if foreign students desert us, or just can’t get visas after a popular vote against immigration.
Now everyone in Britain pays on average £4 a week to the EU and gets £2 back. The net contribution of £2 a week is about £100 a year each, so for a town the size of Cambridge it comes to some £12.5m a year. The University alone is getting more than four times that back directly, and yet more indirectly. And the same goes for many other university towns too; even Newcastle gets more than would be raised by everyone in the city paying £2 a week.
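A back-of-envelope check (my own arithmetic; Cambridge’s population of roughly 120,000 is an assumed round figure):

```latex
% Net contribution per person, then scaled to a Cambridge-sized town:
\pounds 2 \times 52 \approx \pounds 104 \text{ per person per year}, \qquad
\pounds 104 \times 120{,}000 \approx \pounds 12.5\text{m per year}.
```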
But this is not just about money; it’s about who we are, and also about what other people perceive us to be. If Britain votes to leave Europe following a xenophobic campaign against immigrants, people overseas may conclude that Britain is no longer a cool place to study, or to start a research lab. Even some of the people already here will leave. We will do the best we can to keep the flame alight, but it will be very much harder for Cambridge to remain a world-leading university.
See also the Cambridge News editorial, and my piece yesterday on Brexit and tech.
The debate on whether Britain should leave the EU has largely ignored a factor of huge importance to the tech industry – network effects.
So I’ve written an article on what Brexit means for the tech industry from the viewpoint of information economics.
Network effects mean that the value of a transaction often depends on how many other people make similar transactions. They make our industry prone to monopolies. They ensure that the UK, with 1% of world population and 3% of GDP, has little influence on tech markets, which are mostly global. But the EU has real clout; Silicon Valley sees it as the world privacy regulator, as Washington doesn’t care and no-one else is big enough to matter. And most of the other regulations that IT people find annoying, from IP laws to export controls, are also embedded in international treaties. We can’t just tear up the annoying “red tape”, as the Brexit crowd suggest.
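To see why network effects breed monopolies, here’s a toy simulation of my own (nothing from the article): if a network of n users is worth roughly n² to a newcomer, each arrival tends to join the bigger network, and a tiny early lead snowballs into dominance.

```python
# Toy winner-take-all simulation (my own illustration, not from the
# article). Each arriving user joins a network with probability
# proportional to its Metcalfe-style value n^2, so whichever network
# gets ahead early tends to take the whole market.

import random

random.seed(1)
a, b = 51, 49                      # two networks, nearly level pegging
for _ in range(100_000):           # 100,000 users arrive one by one
    va, vb = a * a, b * b          # value grows with the square of size
    if random.random() < va / (va + vb):
        a += 1
    else:
        b += 1

print(f"A: {a:,} users, B: {b:,} users")  # one network gets almost everyone
```

That snowballing is why tech markets are mostly global and winner-take-all, and why a bloc of 500 million consumers gets listened to while a country of 65 million does not.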
Brexit would not only diminish our influence on the laws that affect tech – many of which reflect negative network effects. It would also make startups more expensive, so UK firms would have a harder time exploiting the positive network effects that are often the key to success. And it would damage the successful tech clusters we do have in Cambridge and in London.
Tech clusters need a number of things to thrive, and it’s not just technical network effects that matter but labour-market network effects too; there’s quite a lot of research on that. As good engineers, we can earn good money and live wherever we want, so we congregate in places that are good to live in. These are always open and liberal places, where it’s fine to be from an ethnic minority, or an immigrant, or gay. What would the world’s best and brightest engineers think about moving to Britain if we vote for xenophobia on Thursday?
The article is in Computer Weekly, and there’s also a pdf here.