Responsible vulnerability disclosure in Europe

There is a report out today from the European economics think-tank CEPS on how responsible vulnerability disclosure might be harmonised across Europe. I was one of the advisers to this effort which involved not just academics and NGOs but also industry.

It was inspired in part by earlier work reported here on standardisation and certification in the Internet of Things. What happens to car safety standards once cars get patched once a month, like phones and laptops? The answer is not just that safety becomes a moving target, rather than a matter of pre-market testing; we also need a regime whereby accidents, hazards, vulnerabilities and security breaches get reported. That will mean responsible disclosure not just to OEMs and component vendors, but also to safety regulators, standards bodies, traffic police, insurers and accident victims. If we get it right, we could have a learning system that becomes steadily safer and more secure. But we could also get it badly wrong.

Getting it right will involve significant organisational and legal changes, which we discussed in our earlier report and which we carry forward here. We didn’t get everything we wanted; for example, large software vendors wouldn’t support our recommendation to extend the EU Product Liability Directive to services. Nonetheless, we made some progress, so today’s report can be seen as a second step on the road.

Raising a new generation of cyber defenders


Over the past few years we launched and ran two university-level hacking competitions in order to attract bright students to our field, with the long-term goal of addressing the skills gap in cyber security.

Analysts estimate that, globally, over the next few years, there will be a gap in cyber security of over a million people between the positions that need filling and the people with the skills to fill them.

In 2015 we founded the international Cambridge2Cambridge cyber security challenge, in collaboration with MIT CSAIL, which first took place at MIT, and then in 2016 the UK-level Inter-ACE among the UK ACE-CSRs, which first took place at the University of Cambridge. The Inter-ACE has now expanded beyond the ACEs and the C2C admits university students from anywhere in the world. None of this would have been possible without strong cooperation between academia, government and industry. We are grateful to our many supporters, who are all credited in the report.

After three years, my precious collaborators Graham Rymer and Michelle Houghton have moved on to new jobs and it is time for someone else to pick up the torch. To help our successors, today we publish a comprehensive technical report distilling our experience running these events for the past three years. We wrote it for all those who share our vision and goals and who wish to take these competitions forward: we hope they will find it useful and it will help them make future editions even better. It contains a detailed chronicle of what we did and an extensive list of lessons learnt. Attendees of the Security and Human Behavior 2018 workshop will have heard me speak about some of the associated challenges, from fostering cooperation to redressing gender balance to preventing cheating, with detours into Japanese swordsmanship and Plato.

The extensive appendices contain a wealth of training material including write-ups of our practice CTFs and of the Inter-ACE 2018 for which we developed the problems in-house, as well as the latest course notes for the binary reverse engineering training seminar that we ran in Cambridge several times over the years, initially for our own students and then for hundreds of ACE-CSR participants.

We hope you will enjoy our report and that it will inspire you to contribute to future events in this series, whether as a participant, host or supporting institution, and keep the momentum going.

Frank Stajano, Graham Rymer, Michelle Houghton. “Raising a new generation of cyber defenders—The first three years of the Cambridge2Cambridge and Inter-ACE cyber security competitions”. University of Cambridge Technical Report UCAM-CL-TR-922, June 2018, 307 pages. http://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-922.pdf


Third Annual Cybercrime Conference

The Cambridge Cybercrime Centre is organising another one-day conference on cybercrime on Thursday, 12th July 2018.

We have a stellar group of invited speakers, at the forefront of their fields, who will present various aspects of cybercrime from the point of view of criminology, policy, security economics, law and industry.

This one-day event, to be held in the Faculty of Law, University of Cambridge, will follow immediately after (and will be in the same venue as) the “11th International Conference on Evidence Based Policing” organised by the Institute of Criminology, which runs on the 10th and 11th July 2018.

Full details (and information about booking) are here.

Hiring for the Cambridge Cybercrime Centre

We have three open positions in the Cambridge Cybercrime Centre: https://www.cambridgecybercrime.uk.

We wish to fill at least one of the three posts with someone from a computer science, data science, or similar technical background.

BUT we’re not just looking for computer science people: to continue our multi-disciplinary approach, we wish to fill at least one of the three posts with someone from a criminology, sociology, psychology or legal background.

Details of the posts, and what we’re looking for, are in the job advert here: http://www.jobs.cam.ac.uk/job/17827/.

Bitcoin Redux: crypto crime, and how to tackle it

Bitcoin Redux explains what’s going wrong in the world of cryptocurrencies. The bitcoin exchanges are developing into a shadow banking system: rather than giving their customers actual bitcoin, they display a “balance” and allow them to transact with others. However, if Alice sends Bob a bitcoin, and they’re both customers of the same exchange, the exchange just adjusts their balances rather than doing anything on the blockchain. Under European law this is an e-money service, but is the law enforced? Not where it matters. We’ve been looking at the details.
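The off-chain bookkeeping described above can be made concrete with a minimal sketch (the `Exchange` class and its methods are our invention, for illustration only): when two customers of the same exchange transact, only a database entry changes, and the blockchain never sees the payment.

```python
# Illustrative sketch of an exchange's internal ledger. When both parties
# are customers of the same exchange, a "bitcoin transfer" is just a
# balance update; nothing is broadcast to the blockchain.

class Exchange:
    def __init__(self):
        self.balances = {}          # customer -> displayed BTC "balance"
        self.blockchain_txs = []    # on-chain transactions actually made

    def deposit(self, customer, amount):
        self.balances[customer] = self.balances.get(customer, 0) + amount

    def transfer(self, sender, recipient, amount):
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount
        # Note: self.blockchain_txs is untouched -- no on-chain record.

ex = Exchange()
ex.deposit("Alice", 2.0)
ex.deposit("Bob", 0.5)
ex.transfer("Alice", "Bob", 1.0)
print(ex.balances)         # {'Alice': 1.0, 'Bob': 1.5}
print(ex.blockchain_txs)   # [] -- the payment left no blockchain trace
```

This is why such balances look, legally, like e-money: the customer holds a claim against the exchange, not a coin on the blockchain.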

In March we wrote about how to trace stolen bitcoin, describing new tools that enable us to track crime proceeds on the blockchain with more precision than before. We waited for victims of bitcoin theft and fraud to come to us, so we could test our tools on real cases. However, in most of these cases it was not clear that the victims had ever owned any bitcoin at all.

There are basically three ways you could try to hold a bitcoin. You could buy one from an exchange and get them to send it to a wallet you host yourself, but almost nobody does that.

You could buy one from an exchange and get the exchange to keep the keys for you, so that the asset was unique to you and they were only guarding it for you – just like when you buy gold and the bullion merchant then charges you a fee to guard your gold in his vault. If the merchant goes bust, you can turn up at the vault with your receipt and demand your gold back.

Or you could buy one from an exchange and have them owe you a bitcoin – just as when you put your money in the bank. The bank doesn’t have a stack of banknotes in the vault with your name on it; and if it goes bust you have to stand in line with the other creditors.

It seems that most people who buy bitcoin think that they’re operating under the gold merchant model, while most exchanges operate under the bank model. This raises a whole host of issues around solvency, liquidity, accounting practices, money laundering, risk and trust. The details matter, and the more we look at them, the worse it seems.

This paper will appear at the Workshop on the Economics of Information Security later this month. It contains eight recommendations for what governments should be doing to clean up this mess.

New security lecturer

We’re delighted to announce that the new security lectureship we advertised has been offered to Alice Hutchings, and she’s accepted. We had 52 applicants of whom we shortlisted three for interview.

Alice works in the Cambridge Cybercrime Centre and her background is in criminology. Her publications are here. Her appointment will build on our strengths in research on cybercrime, and will complement and extend our multidisciplinary work in the economics and psychology of security.

Failure to protect: kids’ data in school

If you care about children’s rights, data protection or indeed about privacy in general, then I’d suggest you read this disturbing new report on what’s happening in Britain’s schools.

In an ideal world, schools should be actively preparing pupils to be empowered citizens in a digital world that is increasingly riddled with exploitative and coercive systems. Instead, the government is forcing schools to collect data that are then sold or given to firms that exploit them, with no meaningful consent. There is not even the normal right to request subject access, so you can check whether the information about you is right and have it corrected if it’s wrong.

Yet the government has happily given the Daily Telegraph fully-identified pupil information so that it can do research, presumably on how private schools are better than government ones, or how grammar schools are better than comprehensives. You just could not make this up.

The detective work to uncover such abuses has been done by the NGO Defenddigitalme, who followed up some work we did a decade and more ago on the National Pupil Database in our Database State report and our earlier research on children’s databases. Defenddigitalme are campaigning for subject access rights, the deletion of nationality data, and a code of practice. Do read the report and if you think it’s outrageous, write to your MP and say so. Our elected representatives make a lot of noise about protecting children; time to call them on it.